Automating cut and export of data

This topic contains 3 replies, has 2 voices, and was last updated by Sep 2 years, 8 months ago.

Viewing 4 posts - 1 through 4 (of 4 total)
  • #806

    Sep
    Participant

    Hi Martin,

    Thanks for putting this tool together – I can see it becoming very useful to my work.

    I am pulling Tweets with a hashtag that is being used about 1000 times an hour, and I would like to retrieve every Tweet using this hashtag over a given period. I will not be able to manually cut and paste the Tweets each day before they hit the 18000 cap. Is there any way to automate this process? I.e. cut everything from line 3 down (so the Last Tweet field remains accurate) and send it somewhere else? Or can you think of a way around the cap?

    Thanks in advance.

    #807

    Sep
    Participant

    I’ve just realised that I may have misconstrued ‘Number of Tweets’. Does it refer to:

    a) the maximum number of Tweets that can be stored in the Archive at any one time
    or
    b) the maximum number of Tweets that can be collected with every update?

    If the latter, then I presume the number of Tweets that can be stored in the Archive is significantly higher than 18000, eliminating my problem. I’d be really grateful if you could clarify this. Thanks!

    #810

    mhawksey
    Keymaster

    Hi – it is the maximum number of tweets that can be collected with every update. Google Sheets does have a 2 million cell limit, so with the default number of columns you'll start hitting the limit after a couple of days.
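
    That "couple of days" estimate can be checked with a quick back-of-envelope calculation. The column count below is an assumption for illustration (the actual number of columns in a TAGS archive depends on the version and settings):

    ```javascript
    // Back-of-envelope: how long until a 1000-tweets/hour hashtag hits
    // the Google Sheets 2 million cell limit? COLUMNS is an assumption.
    var CELL_LIMIT = 2000000;
    var COLUMNS = 18;                 // assumed default column count
    var TWEETS_PER_DAY = 1000 * 24;   // from the rate quoted in the question

    var maxRows = Math.floor(CELL_LIMIT / COLUMNS); // ≈ 111,111 rows
    var daysUntilFull = maxRows / TWEETS_PER_DAY;   // ≈ 4.6 days
    ```

    So at that collection rate the sheet fills in roughly four to five days, consistent with the reply above.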

    Your original request is possible but would require some custom script.
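
    Such a custom script could be sketched in Google Apps Script, which runs inside the spreadsheet. This is only an illustrative outline, not part of TAGS itself: the sheet name, spreadsheet ID, and function names are placeholders, and the header row count follows the question's "cut everything from line 3 down" so the Last Tweet field stays accurate. A time-driven trigger (set up in the Apps Script editor) would run it periodically:

    ```javascript
    // Hypothetical Apps Script sketch: periodically move collected tweets
    // out of the Archive sheet into a separate "store" spreadsheet, keeping
    // the first rows (headers / Last Tweet reference) in place.

    var HEADER_ROWS = 2; // keep rows 1-2; cut from row 3 down

    // Pure helper: split a sheet's values into rows to keep and rows to move.
    function splitArchiveRows(values, headerRows) {
      return {
        keep: values.slice(0, headerRows),
        move: values.slice(headerRows)
      };
    }

    // Entry point for a time-driven trigger (e.g. hourly).
    function exportAndCut() {
      var ss = SpreadsheetApp.getActiveSpreadsheet();
      var archive = ss.getSheetByName('Archive');
      var parts = splitArchiveRows(archive.getDataRange().getValues(),
                                   HEADER_ROWS);
      if (parts.move.length === 0) return; // nothing to export yet

      // Append the cut rows to a separate spreadsheet (ID is a placeholder).
      var store = SpreadsheetApp.openById('YOUR_STORE_SPREADSHEET_ID')
                                .getSheets()[0];
      store.getRange(store.getLastRow() + 1, 1,
                     parts.move.length, parts.move[0].length)
           .setValues(parts.move);

      // Delete the exported rows from the Archive, leaving headers intact.
      archive.deleteRows(HEADER_ROWS + 1, parts.move.length);
    }
    ```

    Keeping the row-splitting logic in a separate pure function makes it easy to test outside the spreadsheet environment.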

    #811

    Sep
    Participant

    OK! Thank you.
