In software development, efficiency is key. Go, renowned for its robust concurrency features, offers a straightforward way to improve program performance. As a fan of Go, I’m going to walk through a practical use case for goroutines and channels: speeding up the process of fetching data from external APIs.
The Basics of Go Concurrency
Concurrency in Go is a standout feature, primarily due to the simplicity and efficiency of goroutines (lightweight threads managed by the Go runtime) and channels (typed conduits for passing data between goroutines). Together they let a program work on multiple tasks at once, overlapping time spent waiting on I/O and making better use of available CPU cores, which can significantly reduce execution time.
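To ground the terminology before the case study, here is a minimal, self-contained sketch (not part of the benchmark below) that launches one goroutine and passes its result back through a channel:

package main

import (
    "fmt"
    "time"
)

func main() {
    results := make(chan string) // unbuffered channel of strings

    // Launch a lightweight goroutine that does some "work" and sends its result.
    go func() {
        time.Sleep(100 * time.Millisecond) // simulate a slow task, e.g. a network call
        results <- "done"                  // send the result into the channel
    }()

    fmt.Println("main keeps running while the goroutine works...")
    fmt.Println(<-results) // receiving blocks until the goroutine sends
}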
Case Study: Fetching Stock Price Data
Imagine a scenario where we need to fetch stock price data for multiple companies. Traditionally, this would involve sequential API calls for each company – a time-consuming process. By employing Go’s concurrency, we can transform this process into a more efficient one.
Objective
Our goal is to compare the performance of fetching real-time stock prices for META, AMZN, and AAPL using sequential API calls versus concurrent calls with goroutines.
Hypothesis
We anticipate that the concurrent approach will significantly reduce the total time taken to fetch all stock prices compared to the sequential approach. The reasoning is simple: sequentially, the total time is roughly the sum of the three request times, whereas with goroutines the HTTP requests are all in flight at the same time, so the total should approach the duration of the single slowest request plus a small amount of overhead.
API Setup
For this example, we’ll use the Alpha Vantage API.
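The examples below hardcode a placeholder key ("test_key") for brevity. In real code you would typically read the key from the environment instead of committing it to source control; here is a minimal sketch of that pattern (the ALPHAVANTAGE_API_KEY variable name is just an assumption for illustration):

package main

import (
    "fmt"
    "os"
)

func main() {
    // Read the key from an environment variable rather than hardcoding it.
    apiKey := os.Getenv("ALPHAVANTAGE_API_KEY") // hypothetical variable name
    if apiKey == "" {
        fmt.Println("ALPHAVANTAGE_API_KEY is not set")
        os.Exit(1)
    }
    fmt.Println("API key loaded")
}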
Approach #1: Sequential
First, let’s write a simple Go program to fetch stock prices sequentially. We will also track the total time it takes to execute the main function. This will serve as the primary performance metric.
package main

import (
    "fmt"
    "io"
    "net/http"
    "time"
)

const apiKey = "test_key"

func fetchStockPrice(symbol string) string {
    url := fmt.Sprintf("https://www.alphavantage.co/query?function=GLOBAL_QUOTE&symbol=%s&apikey=%s", symbol, apiKey)
    response, err := http.Get(url) // Make a GET request to the Alpha Vantage API
    if err != nil {
        panic(err)
    }
    defer response.Body.Close()

    body, err := io.ReadAll(response.Body) // Read the response body
    if err != nil {
        panic(err)
    }
    return string(body) // Return the response body as a string
}

func main() {
    start := time.Now() // Record the start time
    companies := []string{"META", "AMZN", "AAPL"}

    // Fetch the stock price for each company
    for _, company := range companies {
        price := fetchStockPrice(company)
        fmt.Println(company, price)
    }

    fmt.Println("Time taken:", time.Since(start)) // Log the total time taken to execute the API calls
}
Output: Approach #1
The API returned the latest stock price and volume information for each company as JSON. We also logged the total time it took to complete each full set of API calls. I ran the program five times and recorded five timings; the JSON output from the first run is shown below, and a sketch for decoding this JSON into a Go struct follows the timings.
META {
    "Global Quote": {
        "01. symbol": "META",
        "02. open": "459.6000",
        "03. high": "485.9600",
        "04. low": "453.0100",
        "05. price": "474.9900",
        "06. volume": "84707646",
        "07. latest trading day": "2024-02-02",
        "08. previous close": "394.7800",
        "09. change": "80.2100",
        "10. change percent": "20.3176%"
    }
}
AMZN {
    "Global Quote": {
        "01. symbol": "AMZN",
        "02. open": "169.1900",
        "03. high": "172.5000",
        "04. low": "167.3300",
        "05. price": "171.8100",
        "06. volume": "117218313",
        "07. latest trading day": "2024-02-02",
        "08. previous close": "159.2800",
        "09. change": "12.5300",
        "10. change percent": "7.8666%"
    }
}
AAPL {
    "Global Quote": {
        "01. symbol": "AAPL",
        "02. open": "179.8600",
        "03. high": "187.3300",
        "04. low": "179.2500",
        "05. price": "185.8500",
        "06. volume": "102551680",
        "07. latest trading day": "2024-02-02",
        "08. previous close": "186.8600",
        "09. change": "-1.0100",
        "10. change percent": "-0.5405%"
    }
}
Time taken (five runs):
Run 1: 271.079ms
Run 2: 245.602208ms
Run 3: 221.999959ms
Run 4: 246.19975ms
Run 5: 253.505416ms
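Since the programs in this post simply print the raw response body, here is a minimal sketch (not part of the measured code) of how the "Global Quote" payload could be decoded into a Go struct so individual fields such as the price are easy to work with. The struct and the chosen fields are an assumption based on the JSON shown above:

package main

import (
    "encoding/json"
    "fmt"
)

// globalQuoteResponse mirrors the subset of the Alpha Vantage payload we care about.
type globalQuoteResponse struct {
    GlobalQuote struct {
        Symbol string `json:"01. symbol"`
        Price  string `json:"05. price"`
        Volume string `json:"06. volume"`
    } `json:"Global Quote"`
}

func main() {
    // A trimmed-down example body, matching the structure returned by the API.
    body := []byte(`{"Global Quote": {"01. symbol": "META", "05. price": "474.9900", "06. volume": "84707646"}}`)

    var quote globalQuoteResponse
    if err := json.Unmarshal(body, &quote); err != nil {
        panic(err)
    }
    fmt.Println(quote.GlobalQuote.Symbol, quote.GlobalQuote.Price, quote.GlobalQuote.Volume)
}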
Approach #2: Concurrency with Goroutines
Now, let’s modify the fetchStockPrice function to take two new parameters: a channel and a pointer to a sync.WaitGroup. The WaitGroup lets main block until every goroutine has finished, while the channel carries the results back. Each GET request now runs in its own goroutine; we still make the same request to the API, but each result is sent into the channel, and once all goroutines are done we close the channel and read the results from it.
package main

import (
    "fmt"
    "io"
    "net/http"
    "sync"
    "time"
)

const apiKey = "test_key"

// Added new function parameters: 1) a send-only channel, 2) a pointer to a WaitGroup
func fetchStockPrice(symbol string, ch chan<- string, wg *sync.WaitGroup) {
    defer wg.Done() // Decrement the WaitGroup counter when the function exits
    url := fmt.Sprintf("https://www.alphavantage.co/query?function=GLOBAL_QUOTE&symbol=%s&apikey=%s", symbol, apiKey)
    response, err := http.Get(url)
    if err != nil {
        panic(err)
    }
    defer response.Body.Close()

    body, err := io.ReadAll(response.Body)
    if err != nil {
        panic(err)
    }
    ch <- fmt.Sprintf("%s: %s", symbol, body) // Send the result to the channel
}

func main() {
    start := time.Now()
    companies := []string{"META", "AMZN", "AAPL"}
    ch := make(chan string, len(companies)) // Create a buffered channel
    var wg sync.WaitGroup                   // Create a WaitGroup

    // Launch a goroutine for each company
    for _, company := range companies {
        wg.Add(1)                            // Increment the WaitGroup counter
        go fetchStockPrice(company, ch, &wg) // Run the fetch in its own goroutine
    }

    // Wait for all the goroutines to finish, then close the channel
    go func() {
        wg.Wait()
        close(ch)
    }()

    // Read the results from the channel until it is closed
    for result := range ch {
        fmt.Println(result)
    }

    fmt.Println("Time taken:", time.Since(start))
}
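Before looking at the timings, one caveat about this code: a panic inside any goroutine brings down the entire program, so in production you would typically propagate errors instead of panicking. Here is a hedged sketch of one way to do that, sending a small result struct through the channel (the fetchResult type and this variant of fetchStockPrice are illustrative, not part of the benchmarked code):

package main

import (
    "fmt"
    "io"
    "net/http"
    "sync"
)

// fetchResult pairs a symbol with either its response body or an error.
type fetchResult struct {
    symbol string
    body   string
    err    error
}

func fetchStockPrice(symbol, apiKey string, ch chan<- fetchResult, wg *sync.WaitGroup) {
    defer wg.Done()
    url := fmt.Sprintf("https://www.alphavantage.co/query?function=GLOBAL_QUOTE&symbol=%s&apikey=%s", symbol, apiKey)
    resp, err := http.Get(url)
    if err != nil {
        ch <- fetchResult{symbol: symbol, err: err} // report the error instead of panicking
        return
    }
    defer resp.Body.Close()

    body, err := io.ReadAll(resp.Body)
    if err != nil {
        ch <- fetchResult{symbol: symbol, err: err}
        return
    }
    ch <- fetchResult{symbol: symbol, body: string(body)}
}

func main() {
    companies := []string{"META", "AMZN", "AAPL"}
    ch := make(chan fetchResult, len(companies))
    var wg sync.WaitGroup

    for _, company := range companies {
        wg.Add(1)
        go fetchStockPrice(company, "test_key", ch, &wg)
    }

    go func() {
        wg.Wait()
        close(ch)
    }()

    for result := range ch {
        if result.err != nil {
            fmt.Println(result.symbol, "failed:", result.err)
            continue
        }
        fmt.Println(result.symbol, result.body)
    }
}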
Output: Approach #2
The API returned the same data as Approach #1, with the following set of execution times:
Time taken (five runs):
Run 1: 178.084291ms
Run 2: 178.35575ms
Run 3: 150.610625ms
Run 4: 187.035958ms
Run 5: 154.888458ms
Results
The implementation of concurrency in Approach #2 demonstrated a clear improvement in execution time. The concurrent approach took an average of 169.80 milliseconds per run (all three symbols), while the sequential approach averaged 247.68 milliseconds per run. This equates to a 31.45% reduction in time (77.88 ms)! Note that the concurrent total does not shrink to a third of the sequential total, because it is roughly bounded by the slowest single request plus connection and scheduling overhead.
Unlocking New Levels of Performance with Go Concurrency
In our exploration of Go’s concurrency capabilities, the evidence is clear: employing goroutines and channels can deliver substantial improvements in processing efficiency, especially for I/O-bound work like API calls. For developers and enterprises alike, embracing these features of Go can unlock new levels of performance, enabling more effective use of computing resources and offering an edge in the dynamic world of software development.