Limiting Concurrent Goroutine Execution
Consider a scenario where you have a list of URLs to process and want to limit the number of concurrent goroutines running. For instance, if you have 30 URLs, you may only want 10 goroutines working in parallel.
The code in question tries to cap concurrency with a buffered channel of size parallel, but it does not block as expected once all the URLs have been dispatched. A more reliable way to enforce this limit is to start a fixed number of worker goroutines and feed them URLs through a dedicated channel.
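For reference, the buffered-channel (semaphore) pattern can also work, but only if something waits for the in-flight goroutines to finish; forgetting that wait is a common reason such programs appear not to block at all. Below is a minimal sketch of that pattern, where fetch(url) is a hypothetical stand-in for whatever per-URL work the real program does:

    package main

    import (
        "flag"
        "fmt"
        "sync"
    )

    // fetch is a hypothetical helper standing in for the real per-URL work
    // (HTTP request, hashing, printing, and so on).
    func fetch(url string) {
        fmt.Println("processed", url)
    }

    func main() {
        parallel := flag.Int("parallel", 10, "max parallel requests allowed")
        flag.Parse()

        sem := make(chan struct{}, *parallel) // counting semaphore with *parallel slots
        var wg sync.WaitGroup
        for _, u := range flag.Args() {
            wg.Add(1)
            sem <- struct{}{} // blocks while *parallel URLs are already in flight
            go func(u string) {
                defer wg.Done()
                defer func() { <-sem }() // free the slot when this URL is done
                fetch(u)
            }(u)
        }
        wg.Wait() // without this, main can return before the last goroutines finish
    }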
Here's an improved version of the code that uses the worker-pool approach instead:
    parallel := flag.Int("parallel", 10, "max parallel requests allowed")
    flag.Parse()
    urls := flag.Args()

    // Create a channel to hold URLs that workers will consume
    workerURLChan := make(chan string)

    // Start a goroutine to feed URLs to the workers, closing the channel
    // once every URL has been handed out
    go func() {
        for _, u := range urls {
            workerURLChan <- u
        }
        close(workerURLChan)
    }()

In this updated code, we create one worker goroutine for each allowed concurrent execution, and those workers pull URLs from the dedicated channel. Once all URLs have been distributed, workerURLChan is closed, which lets each worker exit after it finishes its current URL. This mechanism effectively caps the number of goroutines running at any time; the worker side is sketched below.
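The worker side of this design can look roughly like the following; fetch(url) is again a hypothetical helper standing in for the real per-URL work, and the snippet assumes the sync package is imported:

    // Start exactly *parallel worker goroutines that drain workerURLChan
    var wg sync.WaitGroup
    for i := 0; i < *parallel; i++ {
        wg.Add(1)
        go func() {
            defer wg.Done()
            for u := range workerURLChan {
                fetch(u) // hypothetical helper that processes one URL
            }
        }()
    }
    wg.Wait() // returns once workerURLChan is closed and every worker has finished

Because workerURLChan is unbuffered, a URL is only handed over when a worker is ready for it, so at most *parallel URLs are being processed at any moment, and wg.Wait() keeps the program alive until the last one completes.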