*This is a repeat of a Twitter thread I made to test a script*
We are going to go through the process of building my "URL Grabber"
script. I wanted to collect a bunch of links I had posted one night
without having to manually go back through my timeline. I built this in
Python, but the mindset can be used in any language. Anyway, let's get
this done...
1.  Sit down and list out the steps you want. I prefer a whiteboard for
    this, but paper is cool too. I knew I wanted to search my
    timeline -> find tweets with links -> grab the links into a file.
2.  Do a little research into the packages you might need. The Twitter
    API means I need tweepy.
3.  Read the documentation. This is truly a shocking step coming from
    me, but it helps you know what you're looking for, what creds you
    might need beforehand, and gives you a sense of the initial setup.
    In this example I knew I wanted to search a user's (mine) personal
    timeline, and I needed to figure out where the links are in the
    result object.
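The docs also tell you which creds to gather before you write any real code. As a minimal sketch (the env-var names here are my own convention, not tweepy's), checking for all four OAuth creds up front saves a confusing failed API call later:

```python
import os

# The four credentials tweepy's user-auth flow expects.
# Grab yours from the Twitter developer portal.
REQUIRED_CREDS = [
    "TWITTER_CONSUMER_KEY",
    "TWITTER_CONSUMER_SECRET",
    "TWITTER_ACCESS_TOKEN",
    "TWITTER_ACCESS_TOKEN_SECRET",
]

def load_creds(env=os.environ):
    """Return a dict of creds, complaining early about anything missing."""
    missing = [name for name in REQUIRED_CREDS if not env.get(name)]
    if missing:
        raise RuntimeError(f"missing credentials: {', '.join(missing)}")
    return {name: env[name] for name in REQUIRED_CREDS}
```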
4.  Test for data. At this point I make sure my creds work and just try
    to pull whatever data I'm going to parse. Be quick with `ctrl-c` if
    you're going to be getting multiple objects back, and maybe open
    the results in a text file so it's easier on your eyes.
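Dumping the raw results to a file is a one-liner worth keeping around. A sketch, assuming the statuses are plain dicts (with tweepy's model objects you'd reach for the raw JSON first):

```python
import json

def dump_for_eyeballing(statuses, path="raw_tweets.txt"):
    """Pretty-print raw API results to a file so you can read them calmly."""
    with open(path, "w") as f:
        for status in statuses:
            f.write(json.dumps(status, indent=2))
            f.write("\n" + "-" * 40 + "\n")
    return path

# Stand-in for a real timeline pull:
sample = [{"id": 1, "text": "hello"}, {"id": 2, "text": "world"}]
```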
5.  Understand how results are returned. A good API tells you this so
    you don't have to strain your eyes, but not every API is good. Make
    sure you know how to parse the big blob of data.
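For this script, the blob worth understanding is Twitter's v1.1 tweet object, which keeps links under `entities.urls`: each entry carries the shortened t.co `url` plus the original `expanded_url`, which is the one we actually want. A trimmed-down example:

```python
def extract_urls(status):
    """Pull the expanded links out of one tweet's JSON blob."""
    return [u["expanded_url"]
            for u in status.get("entities", {}).get("urls", [])]

# A heavily trimmed tweet object, just the fields that matter here:
blob = {
    "text": "check this out https://t.co/abc",
    "entities": {"urls": [{"url": "https://t.co/abc",
                           "expanded_url": "https://example.com/post"}]},
}
# extract_urls(blob) → ["https://example.com/post"]
```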
6.  Start cleaning up. Now that you know what you want, test it out and
    make sure it returns the way you want. By this point I had the
    tweets coming back in order, but I needed to get the URLs, so I
    tested a few things and cleaned up how my data was printed. I
    realized I only needed the tweet text and the URL with it, so I
    wasn't guessing at the links.
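That "text plus its links, nothing else" idea can be sketched like so (again assuming plain-dict tweet objects; tweets without links just get skipped):

```python
def text_and_links(statuses):
    """Keep only tweets that actually have links, paired with their text."""
    pairs = []
    for status in statuses:
        urls = [u["expanded_url"]
                for u in status.get("entities", {}).get("urls", [])]
        if urls:
            pairs.append((status["text"], urls))
    return pairs

statuses = [
    {"text": "look: https://t.co/x",
     "entities": {"urls": [{"expanded_url": "https://example.com/a"}]}},
    {"text": "just words", "entities": {"urls": []}},
]
# text_and_links(statuses) keeps only the first tweet, paired with its link
```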
7.  Delete those unused print statements. Unless you're using logging
    and can easily turn it off, go back through and delete all print
    statements that aren't what you want to see in your console. Do
    this because when you do security work you realize this is the
    weakest link for devs: we console-log everything possible.
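If you'd rather keep the debug output around, Python's `logging` module is the "easily turn it off" option. A sketch of what that swap looks like:

```python
import logging

log = logging.getLogger("urlgrab")

def grab(statuses):
    found = []
    for status in statuses:
        log.debug("raw status: %r", status)  # debug noise, silent by default
        for u in status.get("entities", {}).get("urls", []):
            found.append(u["expanded_url"])
    log.info("found %d links", len(found))
    return found

# One line flips all the debug output on while you're developing:
#     logging.basicConfig(level=logging.DEBUG)
# Leaving the default (WARNING) keeps the console clean everywhere else.
```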
8.  Prosper. That's really it, folks. So to summarize: write it
    out -> initial research -> read docs -> mini test -> verify
    results -> refactor/cleanup -> secure coding cleanup.
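The whole flow fits in a few lines once the API call is out of the way. Here the fetch is a stub standing in for the real tweepy call (`api.user_timeline` in tweepy's v1.1 API) and the tweet shapes are trimmed examples, not real data:

```python
def fetch_timeline():
    """Stub for the real tweepy call; returns tweet-shaped dicts."""
    return [
        {"text": "a link! https://t.co/a",
         "entities": {"urls": [{"expanded_url": "https://example.com/one"}]}},
        {"text": "no links here", "entities": {"urls": []}},
    ]

def grab_urls(statuses):
    """Steps 5-6: parse the blobs, keep only the expanded links."""
    return [u["expanded_url"]
            for s in statuses
            for u in s.get("entities", {}).get("urls", [])]

def save(urls, path="links.txt"):
    """Step 1's end goal: the links land in a file."""
    with open(path, "w") as f:
        f.write("\n".join(urls) + "\n")

# grab_urls(fetch_timeline()) → ["https://example.com/one"]
```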
You can find the completed code for this script
[here](https://github.com/Keheira/Python/blob/master/urlGrab.py)