
Commit 44880a2

Adding PyPI run instructions
1 parent 445e70e commit 44880a2

File tree

1 file changed: +32 -0 lines changed

README.md

Lines changed: 32 additions & 0 deletions
@@ -31,6 +31,38 @@ After collecting the proxy data and filtering the slowest ones it is randomly se
The request timeout is configured at 30 seconds, and if a proxy fails to return a response it is deleted from the application's proxy list.

I have to mention that a different user-agent header is used for each request. The headers are stored in the **/data/user_agents.txt** file, which contains around 900 different agents.
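For illustration only, here is a minimal sketch of that behaviour (random user-agent per request, 30-second timeout, failing proxies dropped), assuming the standard `requests` package; the function names and the relative `data/user_agents.txt` path below are illustrative and not the library's internal code:

````python
# Illustrative sketch only -- not the library's internal implementation.
import random
import requests


def load_user_agents(path='data/user_agents.txt'):
    # One user-agent string per line (the bundled file has around 900 of them)
    with open(path) as f:
        return [line.strip() for line in f if line.strip()]


def proxied_get(url, proxies, user_agents):
    # Pick a random proxy and a random user-agent header for this request
    proxy = random.choice(proxies)
    headers = {'User-Agent': random.choice(user_agents)}
    try:
        # 30-second timeout, matching the behaviour described above
        return requests.get(url, headers=headers,
                            proxies={'http': proxy, 'https': proxy},
                            timeout=30)
    except requests.RequestException:
        # A proxy that fails to return a response is removed from the list
        proxies.remove(proxy)
        return None
````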

## How to use

The project is now distributed as a PyPI package!
To run an example, simply include **http-request-randomizer==0.0.5** in your requirements.txt file.
Then run the code below:

````python
import time

from http.requests.proxy.requestProxy import RequestProxy

if __name__ == '__main__':
    print "Hello"
    start = time.time()
    # Building the proxy handler collects and filters the proxy list
    req_proxy = RequestProxy()
    print "Initialization took: {0} sec".format((time.time() - start))
    print "Size : ", len(req_proxy.get_proxy_list())
    print " ALL = ", req_proxy.get_proxy_list()

    test_url = 'http://icanhazip.com'

    while True:
        start = time.time()
        # Each request goes through a randomly selected proxy with a random user-agent header
        request = req_proxy.generate_proxied_request(test_url)
        print "Proxied Request Took: {0} sec => Status: {1}".format((time.time() - start), request.__str__())
        if request is not None:
            print "\t Response: ip={0}".format(request.text)
        print "Proxy List Size: ", len(req_proxy.get_proxy_list())

        print "-> Going to sleep.."
        time.sleep(10)
````
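If you prefer not to use a requirements.txt file, the package can also be installed directly from PyPI with `pip install http-request-randomizer==0.0.5`.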
## Contributing

Contributions are always welcome! Feel free to send a pull request.
