After further research, I've realized that my initial idea is not viable. What I need is a bit more complicated. The program will need to:

- Launch a specific URL and download the contents of the page, searching for a particular phrase.
- If it finds that phrase, "click" the link associated with it (the phrase would be the link text). If automatically clicking the link is not possible, the program should instead record the link information and request it with the HTTP Referer header set to the page on which the link was found.
- If the phrase is not found on that page, move to the next page (as if moving from one set of search-engine results to the next, looking for the proper listing on each one).

It would need to be designed around set missions: start with one URL, complete the above task, then move on to the next URL, then the next, and so on to the end of the mission list.
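The core of that task can be sketched with the Python standard library alone. This is a minimal illustration, not a bid for the full program: the phrase matching, the `find_link` helper, and the sample markup are all assumptions made up for the example. "Clicking" the link reduces to issuing a GET for its `href` with the Referer header pointing at the page the link was found on.

```python
import urllib.request
from html.parser import HTMLParser


class PhraseLinkFinder(HTMLParser):
    """Collects the href of every <a> tag whose link text contains the phrase."""

    def __init__(self, phrase):
        super().__init__()
        self.phrase = phrase.lower()
        self._current_href = None  # href of the <a> we are currently inside
        self._text = []            # text accumulated inside that <a>
        self.matches = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._current_href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._current_href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._current_href is not None:
            if self.phrase in "".join(self._text).lower():
                self.matches.append(self._current_href)
            self._current_href = None


def find_link(html, phrase):
    """Return the href of the first link whose text contains phrase, else None."""
    finder = PhraseLinkFinder(phrase)
    finder.feed(html)
    return finder.matches[0] if finder.matches else None


def follow_link(url, referer):
    """'Click' the link: a plain GET with the Referer header set to the
    page the link was found on."""
    req = urllib.request.Request(url, headers={"Referer": referer})
    return urllib.request.urlopen(req)
```

A mission runner would loop over its URL list, call `find_link` on each downloaded page, and either `follow_link` on a match or advance to the next results page.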
## Deliverables
The program will need to accept and handle cookies properly but not store them: use them only during a given session, then delete them when done (i.e., when finished with one URL and before moving on to the next in a given mission). In addition, it will need to be schedulable (launch mission 1 at these specified times and continue until done, launch mission 2 at these specified times, then mission 3, mission 4, etc.). One additional thing I didn't mention in the update: it will need a preset number of generic browser/OS configurations that it sends out, randomly switching between the different ones for each successive URL it is working on (e.g., one time it identifies itself as IE 5.0 on a Win98 system, the next as Netscape on WinNT, the next as something else, etc.).
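Both the throwaway-cookie requirement and the browser/OS rotation fall out naturally if each URL gets a fresh session object. A rough stdlib sketch, assuming a hypothetical `USER_AGENTS` preset list (the specific identity strings below are placeholders, not a spec):

```python
import random
import urllib.request
from http.cookiejar import CookieJar

# Hypothetical preset of generic browser/OS identities to rotate through.
USER_AGENTS = [
    "Mozilla/4.0 (compatible; MSIE 5.0; Windows 98)",
    "Mozilla/4.7 [en] (WinNT; U)",
    "Mozilla/4.0 (compatible; MSIE 5.5; Windows NT 4.0)",
]


def make_session():
    """Build an opener with an in-memory cookie jar and a randomly chosen
    browser/OS identity. The jar is never written to disk, so discarding
    the opener after one URL discards its cookies as required."""
    jar = CookieJar()  # in-memory only; dies with the session
    opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))
    opener.addheaders = [("User-Agent", random.choice(USER_AGENTS))]
    return opener, jar
```

The mission scheduler would then call `make_session()` once per URL, so cookies set while working one listing never leak into the next, and each request goes out under a different randomly chosen configuration.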