
angular - How to send 1000 XHR requests with a maximum number of parallel requests

I have an Angular application which needs to send N XHR requests, where 1 <= N <= 10000.

The application needs to handle this as fast as possible, so preferably there should be multiple active XHR requests at the same time, i.e. a sliding window of concurrent requests. Using WebSockets or another streaming-style solution is not possible due to server-side API limitations.

My first idea was to use something like RxJS forkJoin, but I struggle to limit the number of concurrent requests. As far as I know, browsers cap the maximum number of simultaneous requests; for instance, Chrome will allow only 8 at a time.

Most of the solutions/tutorials I found either a) do not limit the maximum number of concurrent connections, or b) do not adapt dynamically (timeout-based solutions are not efficient for this task).

For instance:

const test = () =>
  request(`https://swapi.co/api/people/1/`)
    .pipe(
      delay(1000),
      switchMap(response => from(response.films)),
      concatMap((url: string) => request(url).pipe(delay(1000))),
      scan((acc, res) => [...acc, res.title], []),
      tap(console.log)
    )
    .subscribe()

is not good for me, as the limitation is achieved with delay. I would like something closer to a thread-pool solution: there is a maximum of Y concurrent connections, and as soon as one finishes, a new request starts immediately.

const test = () =>
  request(`https://swapi.co/api/people/1/`)
    .pipe(
      switchMap(response => from(response.films)),
      specialOperatorIAmLookingFor((url: string) => request(url), 8),   // where '8' is the maximum number of parallel requests
      scan((acc, res) => [...acc, res.title], []),
      tap(console.log)
    )
    .subscribe()

Any ideas on how to solve this nicely? It feels like RxJS should already have a solution for this.


1 Reply


You could try the RxJS bufferCount and concatMap operators along with forkJoin().

From the bufferCount docs:

Collect emitted values until provided number is fulfilled, emit as array.

So it collects n notifications and emits them as an array. We can then pass each array through forkJoin() to run n requests in parallel.

Try the following.

I assume this.urls is a collection of HTTP request observables, similar to

urls = [
  this.http.get('url1'),
  this.http.get('url2'),
  this.http.get('url3'),
  ...
];
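With N up to 10000 you would of course build that array programmatically rather than by hand. A minimal sketch, assuming Angular's HttpClient and a hypothetical buildRequests helper (the name and URL list are placeholders for illustration):

import { Observable } from 'rxjs';

// Hypothetical helper: map N endpoint URLs to cold HttpClient observables.
// Nothing is sent yet -- each GET only fires once forkJoin() subscribes to it.
buildRequests(endpointUrls: string[]): Observable<Object>[] {
  return endpointUrls.map(url => this.http.get(url));
}

// usage: this.urls = this.buildRequests(allUrls);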

Then the request-triggering code would look like

import { from, forkJoin } from 'rxjs';
import { bufferCount, concatMap } from 'rxjs/operators';

bufferedRequests() {
  from(this.urls).pipe(
    bufferCount(6),      // <-- adjust the number of parallel requests here
    // forkJoin() fires the buffered requests in parallel; concatMap waits for
    // the whole batch to complete before subscribing to the next batch
    concatMap(buffer => forkJoin(buffer))
  ).subscribe(
    res => console.log(res),
    err => console.log(err),
    () => console.log('complete')
  );
}

According to this comment by a Chromium engineer, the actual solution to the max-connections-per-host/domain limit would be to use WebSockets or domain sharding. But since you mention that isn't possible in your environment, you could use the buffered-request workaround.

However, I wouldn't buffer up to the max limit. If you send more requests to the same domain than the browser allows, the extra requests are queued until the earlier ones finish. So if you buffered to the maximum allowed limit and your application then sent an additional request to that domain from somewhere else in the app workflow, the entire app could stall behind the buffer.

So it's better to use either WebSockets or domain sharding. If neither is possible, buffer the requests to a number less than* the maximum allowed limit.

* Obviously, if you're 100% sure no other requests to the same domain will be triggered while the buffered requests are in flight, then you could buffer up to the maximum allowed limit.
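One more note: bufferCount + forkJoin processes the requests in fixed batches, so one slow response holds back the entire next batch. If you really want the sliding-window behaviour from the question (a new request starts the moment one finishes), RxJS mergeMap accepts a concurrent argument that does exactly that. A minimal sketch, reusing the same this.urls array of cold requests (the method name is a placeholder):

import { from } from 'rxjs';
import { mergeMap } from 'rxjs/operators';

slidingWindowRequests() {
  from(this.urls).pipe(
    // At most 6 inner subscriptions at a time; as soon as one request
    // completes, mergeMap subscribes to the next queued one.
    mergeMap(request => request, 6)
  ).subscribe(
    res => console.log(res),
    err => console.log(err),
    () => console.log('complete')
  );
}

Unlike the forkJoin version, results are emitted one by one and not necessarily in the original order, so collect or index them accordingly.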

