Probably because they weren't well supported in the West at the time, and Apple didn't want people sending messages and emails from their iPhone with missing glyphs. It's one of those things you need a consensus for: both the sender and the recipient have to support it. It's not something like iMessage, where Apple only has to worry about their own ecosystem.
They weren't standard yet. Sending messages containing Unicode Private Use Area glyphs that not even other Apple products could display, let alone other vendors' (non-Japanese) phones, would be an interoperability nightmare.
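For the curious, the Private Use Area ranges are defined in the Unicode standard, and each Japanese carrier mapped its own emoji into them, so another vendor's phone had no glyph to show. A minimal sketch of checking for PUA code points (the specific SoftBank mapping in the comment is from memory of the old carrier tables, so treat it as illustrative):

```python
# Unicode Private Use Area ranges: code points here have no
# standardized meaning, so each vendor assigned its own emoji
# and other phones rendered garbage or nothing.
PUA_RANGES = [
    (0xE000, 0xF8FF),     # Basic Multilingual Plane PUA
    (0xF0000, 0xFFFFD),   # Supplementary PUA-A
    (0x100000, 0x10FFFD), # Supplementary PUA-B
]

def is_private_use(ch: str) -> bool:
    """Return True if the character sits in a Private Use Area."""
    cp = ord(ch)
    return any(lo <= cp <= hi for lo, hi in PUA_RANGES)

# A legacy carrier emoji in the PUA vs. a standardized codepoint:
print(is_private_use("\ue04a"))  # True: PUA, vendor-specific glyph
print(is_private_use("\u2600"))  # False: U+2600 SUN is standardized
```

Once emoji were assigned standardized code points in Unicode 6.0 (2010), this interoperability problem mostly went away.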
Iraq was much more popular in 2003 [1] than the current war in Iran is [2].
[1] "In the months leading up to the war, majorities of between 55% and 68% said they favored taking military action to end Hussein’s rule in Iraq. No more than about a third opposed military action."
[2] "Some 27% of respondents said they approved of the strikes, which were conducted alongside Israeli attacks on Iran, while 43% disapproved and 29% were not sure"
hey, it's Samip (behind the Slowrun repo). yeah that's a fair point, we will mention them in the blog. but there are a couple of major differences:
1. our emphasis is on using more compute to get better data efficiency. this is important because there are lots of hacky changes that will get lower loss, but when compared against general methods that leverage a lot of compute, they don't do so well. and you can already see how this emphasis on compute leads to different methods from BabyLM's!
2. the reasoning behind the repo has nothing to do with how much data a child sees, and our dataset isn't tailored toward that either. it's simple pretraining on a random subset of the internet. we know there are better training algorithms that get lower loss on that data, and we are finding them.
There might be boots on the ground eventually given Trump's speech.
>The lives of courageous American heroes may be lost and we may have casualties. That often happens in war, but we’re doing this not for now. We’re doing this for the future, and it is a noble mission
Iran is hitting back at US bases so it could be related to those risks, rather than a full invasion.
(Crazy idea, maybe the people shouldn’t be left in the dark about their government’s war plans by having a deliberate legislative body debate and vote on it)
It's a sinister statement, but for all the hardware the U.S. has moved to the region, it hasn't moved the things it would need for ground operations.
Seems like it could be problematic in the future since code can't be fixed by humans so the only source of future code for training is unedited Claudeslop.
Just pay another few thousand to fix that bug. When all the devs are gone and you only have AI, even a small bug fix will be charged at premium rates by Claude.
It will say: this bug is costing you 100k a year. We'll fix it for a 10k one-time fee.
Or they communicate in languages we cannot understand.
Even among human languages, some sound all alike to people who aren't native speakers.
Chinese, for example, has a million words that all sound like "shi", other tonal languages like Vietnamese are similarly indistinguishable to English natives, and Japanese speakers treat R and L as the same sound.
Elephants and dolphins have been known to assign unique names for each other.
Octopuses and other cephalopods communicate by changing the color of their skin, EVEN WITH SOME OTHER FISH! BBC's Blue Planet has an episode where an octopus and a grouper fish coordinate via color to trap prey.
Ants and other insects communicate via pheromones and "smell".
Are you seriously going to stick to a human-chauvinistic stance that only we have a "language"?
"For over two decades, Professor Toshitaka Suzuki dedicated his life to studying the Japanese tit — a small songbird native to Japan’s forests. Through years of careful observation and experiments, he discovered something incredible: these birds use grammar-like rules and combine sounds to form meaning, much like how humans use language."
I'm familiar with this case. The "language" of the birds is profoundly primitive: it's limited to two-word combinations where the meaning is just the sum of both words' meanings. Here's a good blogpost about it.
If we're going to be able to have a meaningful discussion on this, first you will need to provide for me the definition of language under which you're operating.
I mean there are space physicists who don't understand dark matter, etc.
I think this is a "qualia" issue: for example, biologists can find out what light frequencies the eyes of a mantis shrimp can receive, but we'll never know what it FEELS like to see a zillion more colors.
You can see this happen with human languages too: ever walk around in a different country? Your brain doesn't even register the sounds other people are making.
It turns out that the fact that mantis shrimp have 12 different color receptors in their eyes means they can see... 12 colors. They can't combine the input from the different color receptors into a spectrum like we and other vertebrates can. Their eyes even perceive different things in different regions of the compound eye. It's a surprisingly limited visual system for all its supposed extra capabilities compared to ours, which to your point makes "seeing like a mantis shrimp" even more inscrutable from our POV.
For anyone else for whom the above answers absolutely nothing without googling what defines the boundary: a more verbose version of the above comment is that they communicate only simple, situational signals (like warning cries or calls to action), not via a symbolic, rule-governed system capable of abstraction, past and future tense, and infinite combination.
Of course, as with all generalizations, this is sort of a lie, but no: whales, chimps, and cephalopods don't meet the official bar.
Free for 6 months after which it auto-renews if I recall correctly.