Pseudo hacker bargain hunting software stack + Bonus VPN

2wice (AVForums Grandmaster)
Joined: Apr 8, 2013 | Messages: 3,132 | Reaction score: 263 | Location: South Africa
I absolutely despise reading how some people beat me to a deal on a classifieds website by mere minutes.

I've been looking for a way to "beat" other hunters (apologies, that might include some of you) to the deals, and I've managed to do it.

It would be counter-productive to spoon-feed the whole world the exact method and shaft myself in the process, so I'll only explain it broadly and give other hunters of means a sporting chance :tongue:.

At first I ran this on my home PCs, but having a PC on 24/7 makes no sense any more, so after some searching I moved it all to a cloud-based server.

First, the cloud-based server.

Amazon Web Services (http://aws.amazon.com/) runs a 1-year free tier for cloud-based servers.

Open an account; it needs a credit card.

Once that's sorted, you might as well set up a free VPN for yourself. Click on CloudFormation --> Create Stack.

Then under Template, Source, select "Specify an Amazon S3 template URL", paste in this URL https://s3.amazonaws.com/webdigi/VPN/Unified-Cloud-Formation.json and click Next.

Keep going through the wizard, filling in the details as you go.

Once the stack is created, go to the Outputs tab; your VPN IP will be there.
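
If you'd rather script it than click through the console, something like the boto3 sketch below should do the same thing. Fair warning: I set mine up through the console, so treat this as a rough sketch only; it assumes your AWS credentials are already configured, 'free-vpn' is just a placeholder name, and the template may ask for parameters (VPN username, password and the like) that you'd pass via the Parameters argument.

Code:
# Rough sketch (not the console route above): create the same VPN stack with boto3.
# Assumes AWS credentials are already configured; 'free-vpn' is just a placeholder name.
import boto3

cf = boto3.client('cloudformation', region_name='us-east-1')  # pick your region

cf.create_stack(
    StackName='free-vpn',
    TemplateURL='https://s3.amazonaws.com/webdigi/VPN/Unified-Cloud-Formation.json',
    # Parameters=[{'ParameterKey': '...', 'ParameterValue': '...'}],  # if the template asks for any
)

# Wait for the build to finish, then print the outputs (this is where the VPN IP shows up).
cf.get_waiter('stack_create_complete').wait(StackName='free-vpn')
for output in cf.describe_stacks(StackName='free-vpn')['Stacks'][0]['Outputs']:
    print(output['OutputKey'], '=', output['OutputValue'])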

You get 15 GB of traffic, but I find most US web services have long authentication periods, so log in through the VPN and, as soon as you are connected, disconnect from the VPN. The services will keep running, but on your normal data.

Now you can go back to the resource group and click on EC2 (Virtual Servers in the Cloud).

Click on Launch Instance; any of the free-tier-eligible options will work, including the Windows Server options.

I use an Ubuntu Server instance because it is easier for me. This will also work on Windows, but there are more hoops to jump through.

I'm also not going to explain how to set up an instance. It's not that hard.
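
For those who want a rough idea anyway, something like the boto3 sketch below launches a free-tier instance. I used the console wizard myself, and the AMI ID and key pair name below are placeholders (look up the current Ubuntu AMI for your region).

Code:
# Rough sketch only: launch a free-tier Ubuntu instance with boto3 instead of the console wizard.
# The ImageId is a placeholder -- look up the current Ubuntu LTS AMI for your region.
import boto3

ec2 = boto3.client('ec2', region_name='us-east-1')

result = ec2.run_instances(
    ImageId='ami-xxxxxxxx',      # placeholder: Ubuntu LTS AMI in your region
    InstanceType='t2.micro',     # free tier eligible
    KeyName='my-keypair',        # placeholder: the key pair you will SSH with
    MinCount=1,
    MaxCount=1,
)
print(result['Instances'][0]['InstanceId'])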

Remember to set up a billing alarm so you don't get a nasty bill shock in dollars.
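
If you like doing things in code, the alarm can be sketched with boto3 too. A few assumptions here: billing alerts must be enabled in the account preferences first, billing metrics only live in us-east-1, and the SNS topic ARN and $10 threshold are placeholders.

Code:
# Sketch: a CloudWatch alarm on the estimated monthly bill. Billing metrics live in us-east-1
# and only appear once billing alerts are enabled in the account preferences.
# The SNS topic ARN and the $10 threshold are placeholders.
import boto3

cloudwatch = boto3.client('cloudwatch', region_name='us-east-1')

cloudwatch.put_metric_alarm(
    AlarmName='billing-shock-guard',
    Namespace='AWS/Billing',
    MetricName='EstimatedCharges',
    Dimensions=[{'Name': 'Currency', 'Value': 'USD'}],
    Statistic='Maximum',
    Period=21600,                 # check every 6 hours
    EvaluationPeriods=1,
    Threshold=10.0,               # alert once the estimated bill passes $10
    ComparisonOperator='GreaterThanThreshold',
    AlarmActions=['arn:aws:sns:us-east-1:123456789012:billing-alerts'],  # placeholder SNS topic
)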

Do not run more than one instance at a time. You can create as many as you want, but the free tier only covers about 750 instance-hours a month, which is enough to run one instance all year, or two for half the year, and so on.

Once you have the instance up and running, create a security key pair and keep it somewhere safe.

SSH into the instance using the key you created.

Install the following software:

Urlwatch https://thp.io/2008/urlwatch/

Beautiful Soup http://www.crummy.com/software/BeautifulSoup/bs4/doc/

You are going to have to install these manually from the command line, not from the repositories (the packaged versions are outdated). HAHAHAHAHAH! Good luck.  :BWAHAHAH:

READ EVERYTHING to understand how this works.

Set up urlwatch by creating urls.txt in ~/.urlwatch and putting the URLs you want to monitor in there, one per line (example below).
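
For reference, urls.txt is just one URL per line; the addresses below are placeholders in the same redacted style as my output further down, not real pages.

Code:
http://www.********.co.za/s-thorens/*******
http://www.********.co.za/s-turntables/*******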

Create and edit ~/.urlwatch/lib/hooks.py:

Code:
from bs4 import BeautifulSoup

def filter(url, data):
    # Only filter the pages whose URL contains our marker; leave everything else untouched.
    if '*******' in url:
        soup = BeautifulSoup(data, 'html.parser')
        # Reduce the page to just the results-count tag, so only a real change triggers a report.
        if soup.find('p', {'class': 'pagination-results-count'}):
            return str(soup.find('p', {'class': 'pagination-results-count'}))
        else:
            return data
    else:
        return data
'*******' is any unique identifier in the URL you put in urls.txt; this is so you can monitor more than one page at a time.

This is the tricky bit: go to the page you want to monitor and view the source of the document. Cross your fingers it comes out as normal HTML; if not, you are going to have to use the supplied tools to reformat it to HTML.

Now you have to find a piece of text that will change when a new ad is posted, find the tags associated with that text, and pass them to the BeautifulSoup filter as above.
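
If you want to sanity-check your chosen tag before wiring it into hooks.py, a quick throwaway script like this works (the URL is a placeholder and 'pagination-results-count' is just the class I happen to use; swap in whatever you found in your page's source):

Code:
# Quick local test of the BeautifulSoup filter before handing it to urlwatch.
# The URL is a placeholder; 'pagination-results-count' is the class I filter on --
# use whatever tag changes when a new ad appears on your page.
import urllib.request
from bs4 import BeautifulSoup

data = urllib.request.urlopen('http://www.example.com/your-search-page').read()
soup = BeautifulSoup(data, 'html.parser')
tag = soup.find('p', {'class': 'pagination-results-count'})
print(tag if tag is not None else 'Tag not found - pick a better marker from the page source')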

If you've done everything right, you will get console output of any changes each time urlwatch is run. Set up a cron job for the interval you want and pass your SMTP email details to the urlwatch parser so you get an email every time a page changes.
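
I won't go into my exact cron and email setup, but one possible way to wire it up is a small wrapper that cron calls, which runs urlwatch and mails the output whenever something changed. The SMTP host, login, addresses and paths below are placeholders, and the urlwatch path depends on where you unpacked it.

Code:
# mailwatch.py -- one possible cron wrapper: run urlwatch and email the output if a page changed.
# SMTP host, login and addresses are placeholders; the urlwatch path depends on where you unpacked it.
# Example crontab entry to run it every 5 minutes:
#   */5 * * * * python3 /home/ubuntu/mailwatch.py
import smtplib
import subprocess
from email.mime.text import MIMEText

output = subprocess.check_output(
    ['python3', '/home/ubuntu/urlwatch-1.18/urlwatch']
).decode('utf-8')

if 'CHANGED:' in output:
    msg = MIMEText(output)
    msg['Subject'] = 'urlwatch: a monitored page changed'
    msg['From'] = 'you@example.com'
    msg['To'] = 'you@example.com'

    with smtplib.SMTP_SSL('smtp.example.com', 465) as server:
        server.login('you@example.com', 'your-password')
        server.send_message(msg)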

Code:
ubuntu@ip-172-31-18-167:~/urlwatch-1.18$ python3 urlwatch
***************************************************************************
CHANGED: http://www.********.co.za/s-thorens/*******
***************************************************************************
--- @   Thu, 26 Mar 2015 11:02:33 +0000
+++ @   Thu, 26 Mar 2015 11:04:52 +0000
@@ -1,2 +1,2 @@
-<p class="pagination-results-count"><span>Results 1 to 7 of 7 ads</span>
+<p class="pagination-results-count"><span>Results 1 to 6 of 6 ads</span>
 </p>
***************************************************************************
-- 
urlwatch 1.18, Copyright 2008-2015 Thomas Perl
Website: http://thp.io/2008/urlwatch/
watched 1 URLs in 0 seconds

The output looks like this; this particular run was when I removed my test ad.

No more tears as other people get the sweet deals.

Hope none of you get this working  :tongue:
 