AGU13 - like whoa

Dec 11 2013, filed under dissertation

So I said I wouldn't analyze the data from '13 because I'm already underwater, I have plenty, and I need to get done. However, since I had already worked out OAuth and using twitteR, I figured there was no harm in running a couple of commands, stashing the data somewhere, and maybe pulling it out later if a specific question comes up, or when turning my dissertation into an article (should I live so long!).
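For my own reference, what I mean is roughly this: a minimal sketch, assuming the OAuth credentials are already registered with twitteR, and with the hashtag and file name as placeholders rather than exactly what I ran.

```r
library(twitteR)

# Assumes OAuth is already set up and registered for this session.
# "#AGU13" and the file name are placeholders.
agu <- searchTwitter("#AGU13", n = 1500)  # pull as many recent matching tweets as the API returns
agu_df <- twListToDF(agu)                 # flatten the list of status objects into a data frame
saveRDS(agu_df, "agu13_tweets.rds")       # stash it somewhere for later
```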

I thought, well, it will give me about 2 weeks' worth, but maybe I should give it a try while the conference is still going on to make sure everything works ok. Well, crap. I'm getting anywhere from 99 to 1000 tweets per query... and that's covering at most about 3 hours... and I can't seem to fill in the rest. Bummer.

The search has a sinceid but no untilid... and it has since and until for dates, but those are full days, not down to the hour or minute or anything. So I'm really only able to get about 9pm to midnight GMT. Huh.
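For what it's worth, the date bounds look roughly like this. It's a sketch only; the hashtag and dates are just examples, and since/until take whole days in YYYY-MM-DD form, which is as fine-grained as it gets.

```r
# Whole-day bounds only: no way to ask for, say, noon to 3pm.
day_of <- searchTwitter("#AGU13", n = 1500,
                        since = "2013-12-10", until = "2013-12-11")
day_df <- twListToDF(day_of)
range(day_df$created)  # in practice the results only reach back a few hours
```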

I watched Kim Holmberg's fabulous webinar today, so I'm going to try something he suggested to see if that helps. Otherwise, I kinda need to run the search throughout the day, which I can do if I work from home, but I will have missed the most important days of the conference. It's tapering off now. Sigh.
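If I do end up re-running the search through the day, the idea would be something like the sketch below, building on the stash from earlier: keep the newest tweet id I've already collected and pass it as sinceID so each pass only adds what I don't have yet. Again, the names and hashtag are illustrative, not what I've actually run.

```r
library(twitteR)

# File name matches the earlier sketch; everything here is illustrative.
old_df    <- readRDS("agu13_tweets.rds")
newest_id <- old_df$id[which.max(old_df$created)]  # id of the most recent tweet already stashed

batch <- searchTwitter("#AGU13", n = 1500, sinceID = newest_id)
if (length(batch) > 0) {
  saveRDS(rbind(old_df, twListToDF(batch)), "agu13_tweets.rds")
}
```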
