When Nitter first came around, it was incredible. It really goes to show just how easy it would be for Twitter to provide a fast and accessible version of their site if that were their goal. Too bad it isn't, and Nitter instances will forever run into rate limiting.
Ditto for YouTube - I remember being able to watch videos on it 10 years ago, with even older hardware and software, and the site was far more responsive than it is now.
Yet if you tell the average web developer that those sites should be accessible with software and hardware from that era, or even older in the case of Twitter (rendering short snippets of text and a few small images doesn't require the latest technology, seriously), they seem to become confused or indignant, or reach for some absurd argument to justify it.
Twitter and YouTube are great examples of a lot of the disturbing trends in web development today. Whatever it is, it's not "progress"!
I don't "contribute" to either one, nor do I have an account; I just go there to read or watch what others have contributed.
It means using the "guest token" (gt) to send GraphQL queries. The gt is public and is distributed via the twitter.com public web page; it is the same for every member of the public.
To see the gt, read the contents of twitter.com.
For example, using Chrome
chrome view-source:https://mobile.twitter.com
Ctrl-F gt=
or curl
curl -s https://mobile.twitter.com | grep -o 'gt=[0-9]*'
To retrieve tweets three steps are required: 1. retrieve the "guest token" (gt), 2. retrieve the "REST ID" (rest_id) for the twitter.com user and 3. submit a GraphQL query to retrieve the user's tweets.
JavaScript provided by Twitter on the twitter.com page can do these three steps automatically (but this requires a JavaScript-enabled browser), or they can be done without a browser, e.g., with a different scripting language (personally I use the shell).
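The three steps above can be sketched in shell. This is a hypothetical outline, not a working client: the bearer token, the GraphQL query IDs, and the exact request URLs all have to be read out of the JavaScript that twitter.com itself serves, and they change over time, so the network calls below are left commented out as illustrations. Only the two small text-extraction helpers are concrete.

```shell
#!/bin/sh
# Sketch of the three steps, assuming the pre-2023 twitter.com endpoints.

# Helper: pull the first gt=<digits> value out of HTML/cookie text on stdin.
extract_gt() { grep -o 'gt=[0-9][0-9]*' | head -n 1 | cut -d= -f2; }

# Helper: pull the first "rest_id":"<digits>" value out of JSON on stdin.
extract_rest_id() {
  grep -o '"rest_id":"[0-9][0-9]*"' | head -n 1 | grep -o '[0-9][0-9]*'
}

# Placeholders, all taken from the JavaScript that twitter.com serves:
# BEARER=...       public bearer token embedded in that JS
# QID_USER=...     GraphQL query ID for the user-lookup query
# QID_TWEETS=...   GraphQL query ID for the user-tweets query

# Step 1: guest token from the public web page.
# gt=$(curl -s https://mobile.twitter.com | extract_gt)

# Step 2: rest_id for a given screen name.
# rest_id=$(curl -s -H "authorization: Bearer $BEARER" \
#     -H "x-guest-token: $gt" \
#     "https://twitter.com/i/api/graphql/$QID_USER/..." | extract_rest_id)

# Step 3: GraphQL query for that rest_id's tweets, same headers.
# curl -s -H "authorization: Bearer $BEARER" -H "x-guest-token: $gt" \
#     "https://twitter.com/i/api/graphql/$QID_TWEETS/..."
```

The helpers deliberately use only grep/cut rather than a JSON parser, in keeping with the plain-shell approach described above.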
But there must be some kind of IP-based rate limiting or something, right? At least for DDoS protection, even if it's not intended to prevent scraping.
Never have I been happier to be wrong. It used to be hit by rate limiting quite a bit, but the situation seems to have improved, probably thanks to this unofficial API.