
When Nitter first came around, it was incredible. It really goes to show just how easy it would be for Twitter to provide a fast and accessible version of their site if that were their goal. Too bad it isn’t, and Nitter instances will forever run into rate limiting.


Ditto for YouTube - I remember being able to watch videos on it 10 years ago, with even older hardware and software, and the site was far more responsive than it is now.

Yet it seems if you tell the average web developer that those sites should be accessible with software and hardware from that era, or even older in the case of Twitter (rendering short snippets of text and some small images doesn't require the latest technology, seriously), they'll become confused or indignant, or try to use some absurd argument to justify it.

Twitter and YouTube are great examples of a lot of the disturbing trends in web development today. Whatever it is, it's not "progress"!

I don't "contribute" to either one, nor do I have an account; I just go there to read or watch what others have contributed.


Page says “Uses Twitter's unofficial API (no rate limits or developer account required)”. What is this unofficial API?


It means using the "guest token" (gt) to send GraphQL queries. The gt is public: it is distributed via the twitter.com web page and is the same for every visitor.

To see the gt, read the contents of twitter.com

For example, using Chrome

    chrome view-source:https://mobile.twitter.com  
    Ctrl-F gt=
or curl

    curl -sL https://mobile.twitter.com | grep -o 'gt=[0-9]*'
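The same extraction can be tried offline against a saved copy of the page. A minimal sketch, using a canned fragment in place of the real HTML (the cookie-setting line below illustrates where the token appears; it is not a verbatim copy of Twitter's page):

```shell
# Canned fragment standing in for the twitter.com HTML; the real page
# sets the guest token in a document.cookie call shaped like this one.
page='<script>document.cookie = "gt=1451234567890; Max-Age=10800; Domain=.twitter.com; Path=/; Secure";</script>'

# Same extraction as the curl pipeline above, minus the network fetch:
# grab the first gt=<digits> match and strip the "gt=" prefix.
gt=$(printf '%s' "$page" | grep -o 'gt=[0-9]*' | head -1 | cut -d= -f2)
echo "$gt"
```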
To retrieve tweets, three steps are required: 1. retrieve the "guest token" (gt), 2. retrieve the "REST ID" (rest_id) for the twitter.com user, and 3. submit a GraphQL query to retrieve the user's tweets.

The JavaScript Twitter serves on twitter.com can do these three steps automatically (but this requires a JavaScript-enabled browser), or they can be done without a browser, e.g., with a different scripting language (personally I use the shell).
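A rough shell sketch of steps 2 and 3, with a canned JSON response standing in for the real network call. The endpoint path, query ID, and JSON field layout are assumptions for illustration only, not a verified description of Twitter's schema:

```shell
# Assume step 1 already yielded a guest token:
gt=1451234567890

# Step 2: the rest_id comes back in a JSON user-lookup response; this
# canned response illustrates the shape, not Twitter's exact schema.
user_json='{"data":{"user":{"result":{"rest_id":"783214","legacy":{"screen_name":"jack"}}}}}'
rest_id=$(printf '%s' "$user_json" | grep -o '"rest_id":"[0-9]*"' | grep -o '[0-9][0-9]*')

# Step 3: a GraphQL tweets query would then be sent with the gt in a
# guest-token header, e.g. (hypothetical; the real query ID is elided):
#   curl -H "x-guest-token: $gt" \
#     "https://twitter.com/i/api/graphql/<queryid>/UserTweets?variables={\"userId\":\"$rest_id\"}"
echo "gt=$gt rest_id=$rest_id"
```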

For example, some folks use Python:

https://pypi.org/project/twint/


Wow!

But there must be some kind of IP-based rate limiting or something, right? At least for DDoS protection, even if it's not intended to prevent scraping.


I would guess that it's some kind of reverse-engineered internal API that official apps use.


Never have I been more happy to be wrong. It used to be impacted by rate limiting quite a bit, but it looks like the situation has improved, probably due to this unofficial API.


The old rate limit errors were false flags caused by a bug in the API token parser. Nitter instances have yet to run into a real rate limit.


I see. Thanks for the information.


> how easy it would be for Twitter to provide a fast and accessible version of their site if that was their goal

You can spoof your user agent to a mobile handset and visit https://mobile.twitter.com/ if you like a more bare-bones Twitter interface.


Yes, but as of last December (I believe Dec. 15), it still required JS to be turned on.


>Too bad it isn’t and Nitter instances will forever run into rate limiting.

Not if you run your own for yourself.



