[Feat] Be able to pass a timeout param to the endpoints #59
Comments
@nickscamara
@ezhil56x all yours!
@nickscamara
Hi, is this issue still open, or is someone working on it?
@parthusun8, the issue is still open, but fixing it would require some really complex changes to our bull queue system to allow the
@nickscamara should we close this for now?
I can be assigned to work on this.
/attempt #59
In the scrape endpoint, we use the scrapeUrl function and pass the timeout value as an option; if the scrape operation times out, we catch the TimeoutError and return a JSON response with status code 408 (Request Timeout). The crawl endpoint works the same way via the crawlUrl function, and we additionally add a message to each page in the response indicating that the crawl timed out.
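The timeout-and-catch flow described above could be sketched with a generic Promise.race wrapper. This is a minimal illustration, not Firecrawl's actual code: the TimeoutError class and withTimeout helper are assumptions introduced here, and the real scrapeUrl/crawlUrl functions may handle timeouts differently.

```typescript
// Hypothetical TimeoutError, thrown when the user-supplied timeout elapses.
class TimeoutError extends Error {}

// Race a work promise (e.g. a scrape or crawl) against a timer. If the timer
// wins, reject with TimeoutError so the endpoint can respond with HTTP 408.
async function withTimeout<T>(work: Promise<T>, ms: number): Promise<T> {
  let timer: ReturnType<typeof setTimeout>;
  const deadline = new Promise<never>((_, reject) => {
    timer = setTimeout(
      () => reject(new TimeoutError(`Operation timed out after ${ms}ms`)),
      ms,
    );
  });
  try {
    return await Promise.race([work, deadline]);
  } finally {
    // Always clear the timer so a finished scrape doesn't leave it pending.
    clearTimeout(timer!);
  }
}
```

An endpoint handler would then wrap its call, e.g. `await withTimeout(scrapeUrl(url), timeout)`, and map a caught TimeoutError to a 408 response.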
@akay41024: Another person is already attempting this issue. Please don't start working on this issue unless you were explicitly asked to do so. |
The queue management system (BullMQ) does not support this feature for |
Enable the user to pass a "timeout" parameter to both the scrape and the crawl endpoints. If the timeout is exceeded, send the user a clear error message. On the crawl endpoint, return any pages that have already been scraped, but include messages notifying the user that the timeout was exceeded.
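The partial-results behaviour requested for the crawl endpoint could be shaped like the sketch below. The Page interface and buildTimeoutResponse helper are illustrative assumptions; the real Firecrawl response schema may differ.

```typescript
// Hypothetical shape of a scraped page in a crawl response.
interface Page {
  url: string;
  content: string;
  message?: string;
}

// When the crawl times out, return the pages collected so far, each annotated
// with a notice that the timeout was exceeded, along with a 408 status.
function buildTimeoutResponse(partialPages: Page[], timeoutMs: number) {
  return {
    status: 408,
    error: `Crawl timed out after ${timeoutMs}ms`,
    data: partialPages.map((p) => ({
      ...p,
      message: "Timeout exceeded: crawl stopped before completion",
    })),
  };
}
```

The key design point is that already-scraped pages are not discarded on timeout; the caller still receives them, just flagged as incomplete.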
If the task is completed within two days, we'll include a $10 tip :)
This is an intro bounty. We are looking for excited people who will buy in so we can start to ramp up.