Welcome to OGeek Q&A Community for programmer and developer-Open, Learning and Share
Welcome To Ask or Share your Answers For Others


0 votes
430 views
in Technique by (71.8m points)

c# - How to increase the outgoing HTTP requests quota in .NET Core?

I'm trying to send a high volume of HTTP requests from one machine, but it seems that .NET Core (or perhaps Windows) is restricting the number of concurrent outgoing HTTP requests, or limiting how many requests can be made within a given time window.

How can I increase this limit? I remember there was a configuration setting for this in the .NET Framework, but I can't find that either.



1 Reply

0 votes
by (71.8m points)

The HTTP 1.1 specification advised clients to make no more than 2 concurrent connections per domain. Both the .NET Framework and .NET Core apply this limit to desktop applications, while ASP.NET applications have a default limit of 10 concurrent connections. Both runtimes allow you to change these limits.

This limit made sense for browsers years ago, but it's far too restrictive for service-oriented applications. Modern browsers open around 6-8 concurrent connections per host, and service/REST clients can typically handle many more.

ServicePointManager.DefaultConnectionLimit can be used to change the limit for the entire application, e.g.:

// Applies to all outgoing connections made through the ServicePoint infrastructure
ServicePointManager.DefaultConnectionLimit = 100;

You can also specify a limit per HttpClient instance by using an HttpClientHandler with its HttpClientHandler.MaxConnectionsPerServer property set to the desired limit:

var handler = new HttpClientHandler
{
    // Maximum concurrent connections this client will open to any single server
    MaxConnectionsPerServer = 100,
    AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate
};

HttpClient client = new HttpClient(handler);

This way you can set a different limit for each target service.
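As a concrete sketch of per-service limits, you could keep one long-lived HttpClient per target service, each with its own handler. The host names and limit values below are purely illustrative assumptions, not part of the original answer:

```csharp
using System;
using System.Net.Http;

static class ServiceClients
{
    // Hypothetical high-capacity service: allow many concurrent connections
    public static readonly HttpClient FastService = new HttpClient(
        new HttpClientHandler { MaxConnectionsPerServer = 100 })
    {
        BaseAddress = new Uri("https://fast.example.com/")
    };

    // Hypothetical fragile service: keep the connection count low
    public static readonly HttpClient FragileService = new HttpClient(
        new HttpClientHandler { MaxConnectionsPerServer = 10 })
    {
        BaseAddress = new Uri("https://fragile.example.com/")
    };
}
```

Keeping the clients static and reusing them also avoids the socket-exhaustion problems caused by creating a new HttpClient per request.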

Don't rush to set the limit to a huge number. The target services may not be able to handle 20 or 40 concurrent requests from the same client: badly written services may crash or get flooded, and concurrent requests can block each other, reducing actual throughput. Well-written services may impose a per-client rate limit or queue incoming requests.
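One common way to raise the connection limit while still keeping the client's concurrency under control (a pattern of my own suggestion, not from the original answer) is to gate requests through a SemaphoreSlim, with the count chosen as an example value:

```csharp
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

static class ThrottledHttp
{
    static readonly HttpClient Client = new HttpClient(
        new HttpClientHandler { MaxConnectionsPerServer = 100 });

    // Allow at most 20 requests in flight at once (arbitrary example value)
    static readonly SemaphoreSlim Gate = new SemaphoreSlim(20);

    public static async Task<string> GetThrottledAsync(string url)
    {
        await Gate.WaitAsync();
        try
        {
            return await Client.GetStringAsync(url);
        }
        finally
        {
            Gate.Release();
        }
    }
}
```

This caps concurrency regardless of how many callers fire requests at once, so the connection limit becomes an upper bound rather than the throttle itself.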

You'd be surprised how badly some supposedly high-traffic services behave. I've encountered airline services that could crash when as few as 10 concurrent requests arrived within a minute, and badly configured load balancers that kept directing traffic to the crashed instances for one or two minutes until the service restarted, making retries pointless.


