Hello Subscribers, New and Old.
Welcome to Somethings, your weekly dose of highlights, quotes, and notes from my notebook. If you would like to receive this in your inbox, subscribe now. If you want to support, do check out the links in the Friends of Somethings Section.
📕Something to read
- The 40 years of the CD: A great retrospective on the first mainstream digital media format. As a person who made a minor contribution to this history, I’d say it is quite a comprehensive account of the legal side of CD distribution.
📺Something to watch
Probability Paradox
Probability is a tricky subject to get your head around. In this video, a controversial probability problem is posed to the audience. The problem is not limited to the domain of statistics; it also overlaps with language, syntax, expectation, and how large numbers can skew everything. The controversy continues even in the comments. For more on probability, check out this video on Bayes’ Theorem.
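If you want a feel for why Bayes’ Theorem produces such counterintuitive answers, here is a tiny sketch in Python. It is not the example from the video; the scenario and the numbers are made up purely for illustration.

```python
# A minimal, hypothetical illustration of Bayes' Theorem:
#   P(A | B) = P(B | A) * P(A) / P(B)
# Classic setup: a screening test for a rare condition. All numbers are made up.

p_condition = 0.001             # prior: 0.1% of people have the condition
p_pos_given_condition = 0.99    # test sensitivity: P(positive | condition)
p_pos_given_healthy = 0.05      # false positive rate: P(positive | healthy)

# Law of total probability: overall chance of a positive test
p_pos = (p_pos_given_condition * p_condition
         + p_pos_given_healthy * (1 - p_condition))

# Bayes' Theorem: chance of actually having the condition, given a positive test
p_condition_given_pos = p_pos_given_condition * p_condition / p_pos

print(f"P(condition | positive test) = {p_condition_given_pos:.3f}")
# Roughly 0.019: even with a 99%-sensitive test, a positive result means
# under a 2% chance of having the condition, because the condition is so rare.
```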
Slanguage
A great video on how slang terms die. tl;dw: stuffy corporate types co-opting the terms destroys their cachet.
Friends of Somethings
- Refind: The essence of the web, every morning in your inbox. Tens of thousands of busy people start their day with their personalized digest by Refind. Sign up for free and pick your favorite topics and thought leaders. https://refind.com/?utm_source=newsletter&utm_medium=barter&utm_campaign=FU-SmtfFzzhQJDgFEz5eiw
- The Sample: The Sample lets you try the best newsletters based on your interests. If you like one, you can subscribe with one click.
🗣Some Quotes and Notes
Bigger Winner Loses
A succinct quote on the nature of the Rat Race.
The problem with the rat race is that, even if you win, you’re still a rat, which isn’t the kindest phrase, but it has some truth to it.
—Luke Concannon (via Sean Cole), What It’s Like Being a One-Hit Wonder
Recreating Reality
Ted Chiang, writer extraordinaire, summarizes my issue with Large Language Models like GPT-3 (of ChatGPT fame). LLMs are just performing regression analysis on a time series; they can only produce weak copies of previous material. As some researchers put it, they are just stochastic parrots (a toy sketch of that idea follows the quote below).
This analogy to lossy compression is not just a way to understand ChatGPT’s facility at repackaging information found on the Web by using different words. It’s also a way to understand the “hallucinations,” or nonsensical answers to factual questions, to which large language models such as ChatGPT are all too prone. These hallucinations are compression artifacts, but—like the incorrect labels generated by the Xerox photocopier—they are plausible enough that identifying them requires comparing them against the originals, which in this case means either the Web or our own knowledge of the world…if a compression algorithm is designed to reconstruct text after ninety-nine per cent of the original has been discarded, we should expect that significant portions of what it generates will be entirely fabricated.
—Ted Chiang, ChatGPT Is a Blurry JPEG of the Web
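To make the “weak copies” point concrete, here is a toy sketch. This is emphatically not how GPT-3 works internally; it is just a word-level Markov chain, the crudest possible “stochastic parrot,” showing how a model that only learned which word follows which can produce fluent-sounding remixes of its training text and nothing more.

```python
import random
from collections import defaultdict

# Toy "stochastic parrot": a word-level Markov chain. Only an analogy for the
# idea that output is a recombination of patterns seen in the training text.

corpus = (
    "the problem with the rat race is that even if you win "
    "you are still a rat and the race never ends"
)

# Map each word to the words that followed it in the corpus.
follows = defaultdict(list)
words = corpus.split()
for current, nxt in zip(words, words[1:]):
    follows[current].append(nxt)

def parrot(start: str, length: int = 12, seed: int = 0) -> str:
    """Generate text by repeatedly sampling a word that followed the previous one."""
    random.seed(seed)
    out = [start]
    for _ in range(length):
        candidates = follows.get(out[-1])
        if not candidates:  # dead end: this word never had a successor
            break
        out.append(random.choice(candidates))
    return " ".join(out)

print(parrot("the"))
# The output reads vaguely like the corpus but is just a shuffled, lossy echo
# of it: fluent recombination with no model of what the words actually mean.
```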
Platforms are Dead
I have written extensively about how search is well and truly dead. The Internet’s Crypto-Zaddy, Cory Doctorow, explains how similar fates have befallen all platforms.
That’s when Amazon started to harvest the surplus from its business customers and send it to Amazon’s shareholders. Today, Marketplace sellers are handing 45%+ of the sale price to Amazon in junk fees. The company’s $31b “advertising” program is really a payola scheme that pits sellers against each other, forcing them to bid on the chance to be at the top of your search.
—Cory Doctorow, Tiktok’s enshittification
Thank you for joining me this week. If you know someone who might enjoy this, please forward this email to them. See you next week.
Mudassir Chapra