Max retries exceeded with URL in requests
$100-200 USD
Paid on delivery
Hey folks
I am writing a crawling script in Python that pulls a large amount of data through a proxy server.
Network communication uses the requests library, and I am now hitting this error: "HTTPConnectionPool(host='host', port=port): Max retries exceeded with url".
The proxy is a cloud service that rotates IPs automatically, assigning a new IP on every request.
The proxy allows 5 concurrent requests per second.
If anyone knows why this happens and how to fix it, I would appreciate the help.
Thanks.
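For context, this error usually means every connection attempt in urllib3's pool failed before a response came back; a proxy that rotates its IP on each request often drops connections mid-rotation, and requests gives up once its retry budget is spent. A minimal sketch of a workaround (the proxy URL is a hypothetical placeholder) using urllib3's Retry with backoff, plus simple client-side throttling to stay under the 5 requests/second limit:

```python
import time
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Hypothetical rotating-proxy endpoint -- replace with your real proxy URL.
PROXY = "http://user:pass@proxy.example.com:8000"

def make_session(total_retries=5, backoff=1.0):
    """Build a Session that retries failed attempts with exponential backoff.

    Spacing retries out (0s, 1s, 2s, 4s, ...) gives the rotating proxy time
    to hand out a fresh IP instead of failing all attempts back to back.
    """
    retry = Retry(
        total=total_retries,
        backoff_factor=backoff,
        status_forcelist=[429, 500, 502, 503, 504],  # also retry these statuses
    )
    session = requests.Session()
    adapter = HTTPAdapter(max_retries=retry)
    session.mount("http://", adapter)
    session.mount("https://", adapter)
    session.proxies = {"http": PROXY, "https": PROXY}
    return session

def fetch(session, url, min_interval=0.2):
    """Fetch one URL, then sleep so the loop stays under ~5 requests/second."""
    resp = session.get(url, timeout=10)
    time.sleep(min_interval)
    return resp
```

Reusing one Session (rather than calling requests.get in a loop) also keeps connection pooling under control, which matters when the proxy caps you at 5 concurrent requests.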
Project ID: #32681853
About the Project
9 freelancers are bidding on this job, with an average bid of $140/hour
Hello employer, I have a theory regarding your issue. I have faced such issues while developing many scrapers. You can message me to discuss a possible solution to your project, and if it is a simple solution, we can negotiate …
Hi there, I have gone through your requirement to scrape lots of websites. I am an EXPERT in building scraping tools/scripts. Hence, I can SURELY work on your project. I have 4 YEARS of EXPERIENCE in developing PH …
Hi there, I would like to work on and complete your project for just €100 EUR; contact me for further details. Regards, Adnan Ali.
Hello, if you are looking for assistance or guidance in Python, AI/ML, backend, Flask, Django, web scraping, or anything at all, we are the organization you are looking for. Our team has been working in this fiel …
Hi there, can I see the code to better understand the cause of the problem? I have decent knowledge of Python and can help you with the issue. Can we have a chat to discuss it further in detail?
Hi there, I read through the job details extremely carefully and I am absolutely sure that I can do the project very well. I have good experience with Python and web scraping, so I can help you very well. I work according …