Overhead of goroutines in Golang
So I understand goroutines have low overhead, but I'm wondering just how good they are. If I have a server that handles incoming messages, how does creating a new goroutine to process each incoming message compare to using the standard producer/consumer model with channels?
Is it reasonable to have a high-performance Go server that spawns a new goroutine for every incoming request?
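To make the comparison concrete, here is roughly what I mean by the two designs (just a sketch; `process` and the function names are made up and stand for whatever work each message needs):

```go
package server

// process stands in for whatever per-message work the server does.
func process(msg []byte) {
	_ = len(msg) // placeholder for real work
}

// Approach 1: spawn a new goroutine for every incoming message.
func handlePerMessage(messages <-chan []byte) {
	for msg := range messages {
		go process(msg)
	}
}

// Approach 2: a fixed pool of consumer goroutines reading from one channel.
func handleWorkerPool(messages <-chan []byte, workers int) {
	for i := 0; i < workers; i++ {
		go func() {
			for msg := range messages {
				process(msg)
			}
		}()
	}
}
```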
The built-in http package uses a goroutine for each connection, and there are numerous benchmarks showing it handling thousands of concurrent users. So unless you have many connections with many messages each, I'd say it's reasonable to create a new goroutine for each message. In any case, Go has a nice benchmarking feature you can use to verify your assumptions.
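For example, a rough benchmark along these lines (hypothetical names, reusing the `process` stub from the sketch in the question; put it in a `_test.go` file and run `go test -bench .`) would let you measure both approaches on your own workload:

```go
package server

import (
	"sync"
	"testing"
)

// BenchmarkPerMessage spawns one goroutine per message (approach 1).
func BenchmarkPerMessage(b *testing.B) {
	var wg sync.WaitGroup
	msg := []byte("hello")
	for i := 0; i < b.N; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			process(msg)
		}()
	}
	wg.Wait()
}

// BenchmarkWorkerPool feeds the same messages to a fixed pool of consumers (approach 2).
func BenchmarkWorkerPool(b *testing.B) {
	const workers = 8
	msgs := make(chan []byte)
	var wg sync.WaitGroup
	for i := 0; i < workers; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for m := range msgs {
				process(m)
			}
		}()
	}
	msg := []byte("hello")
	for i := 0; i < b.N; i++ {
		msgs <- msg
	}
	close(msgs)
	wg.Wait()
}
```

Comparing ns/op (and allocations, with `-benchmem`) between the two should tell you whether goroutine-per-message costs you anything meaningful for your workload.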
We were actually doing stress testing with a similar approach: we spawned a new goroutine for every HTTP request. The concurrency was so good that within 10 seconds we had reached something like 100,000 requests. The only bottleneck you might face is memory, because if the processing isn't fast enough you can run out of memory in that process.
I'm quite sure there is a workaround for it, but that is one reason why you would want to throttle rather than spawn unlimited goroutines. The problem we faced is that we were trying to get data from another API which simply couldn't keep up with this level of concurrency, hence the earlier suggestion to throttle.
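A common way to do that throttling (just a sketch of the general pattern, not the exact code we used) is a buffered channel acting as a counting semaphore, so only a bounded number of goroutines are in flight at once:

```go
package server

// throttled still spawns a goroutine per message, but a buffered channel used
// as a counting semaphore caps how many run at the same time.
func throttled(messages <-chan []byte, limit int) {
	sem := make(chan struct{}, limit)
	for msg := range messages {
		sem <- struct{}{} // blocks once `limit` goroutines are in flight
		go func(m []byte) {
			defer func() { <-sem }() // free the slot when done
			process(m)
		}(msg)
	}
}
```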