Links (8 July 2024)
– Baldur Bjarnason
- “Rust has a HUGE supply chain security problem”. “Any of these dependencies can compromise not only the final program, but also the developers’ workstations, Continuous Integration systems and more.”
- “Microsoft tells more customers their emails have been stolen • The Register”. “It took a while, but Microsoft has told customers that the Russian criminals who compromised its systems earlier this year made off with even more emails than it first admitted.” If it seems like I’m always saying that Microsoft is really really bad at security, that’s because they’re always fucking security up.
- “Making an image with generative AI uses as much energy as charging your phone | MIT Technology Review”. “Luccioni says she hopes the research will encourage people to be choosier about when they use generative AI and opt for more specialized, less carbon-intensive models where possible.” ‘Use smaller, more specialised models’ was literally one of the main recommendations I put in my book a year ago 😎
- “Leaving and arriving | everything changes”. “One thing is abundantly clear to me: the skills that made people good at building websites are readily transferable and even sought out skills in other fields.”
- “Minutes to Midnight - Leaving the web industry”. This is a happy story, not a sad one.
- “On AI and the commoditisation of design – Scott Riley”. “That’s because I know the value good design can bring to a project, and it’s not the output. Design is about humans, about sense-making, systems thinking, and craft.”
- “It’s about time I tried to explain what progressive enhancement actually is - Piccalilli”
- ‘“Technical” skills’. “They’re simply the skills used to produce the work.”
- “The Expense of Unprotected Free Software - ACM Queue”. “Fixing this is going to be messy, expensive, and slow. Yet we have no choice but to take on the challenge since we now know that—with stakes this high—a saboteur must be willing to spend at least three years taking over as a maintainer of XZ in place of Lasse.”
- “Forget “show, don’t tell”. Engage, don’t show! • Lea Verou”. “It highlights how essential it is for students to a) understand why what they are learning is useful and b) put it in practice ASAP. You can’t retain information that is not connected to an obvious purpose.” As Dewey pointed out over a century ago, what you do is what you learn.
- “How Did Silicon Valley Turn into a Creepy Cult?”. “In short, megalomania has gone mainstream in the Valley.”
- “Declare your AIndependence: block AI bots, scrapers and crawlers with a single click”. Not a huge fan of Cloudflare, but them offering this service is an indication of both the scale of the abuse (this level of traffic costs money) and the changing attitudes towards “AI”.
- “Google and Microsoft are getting dirtier. They can’t help it, it’s AI! – Pivot to AI”
- “The Iceberg Model: towards unraveling our patriarchal legacy”. “It seems to me that creating software is much more like playing in a jazz ensemble or growing a garden.”
- “Generative AI is a climate disaster”. Also note that these are 2023 numbers. If anything, it’s escalated this year.
- “Why Zig has become the highest-paying programming language”. I did not know that Zig was managed by a not-for-profit foundation.
- “Dynamic Type on the Web • furbo.org”. “This site now supports Dynamic Type on iOS and iPadOS.” Did not realise this was possible.
- “Human Who Codes Newsletter - Node.js, Deno, and Bun”. “That’s why it makes sense to preserve the ability to switch runtimes easily.” What I’d add is that you should use standard web APIs wherever possible (see the sketch at the end of this post). That opens up the opportunity to use your code in browsers or Cloudflare Workers. Web APIs are also the only properly standardised and defined APIs available in JS, which means they aren’t going to change on you.
- “Open Source Software: The $9 Trillion Resource Companies Take for Granted - HBS Working Knowledge”. “Many companies build their businesses on open source software.”
- “OpenAI breach is a reminder that AI companies are treasure troves for hackers | TechCrunch”. This is, what, the third or fourth OpenAI security incident in less than two years? Like I keep saying, this is the sort of clown car you get if you’re all-in on using LLMs for coding.
- “GitHub - explainers-by-googlers/prompt-api: A proposal for a web API for prompting browser-provided language models”. No nononono no NO nope NOPE. Fucking hell no. JFC is this week’s reading off to a bad start.
- “GitHub Copilot Extensions are all you need”. “In this post, we’ll look at the new APIs that empower extensions to interact directly with Language Models and the Chat experience contributed by GitHub Copilot.” The other day I wrote about how VS Code extensions were a security nightmare and that it’s highly likely they’re being exploited. Turns out Microsoft thinks they can do even better by giving extensions access to the security shitshow that is LLMs.
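
To make the Node.js/Deno/Bun point above concrete, here’s a minimal sketch of what “standard web APIs only” code looks like. Everything in it — `fetch`, `URL`, `TextEncoder`, Web Crypto — behaves the same in Node 18+, Deno, Bun, browsers, and Cloudflare Workers. The function name and the idea of hashing a response body are my own illustration, not anything from the linked newsletter.

```typescript
// Portable across Node 18+, Deno, Bun, browsers, and Cloudflare Workers:
// no runtime-specific modules, only globally available web standards.
async function fetchAndHash(endpoint: string): Promise<string> {
  // URL is the web-standard parser, unlike Node's legacy url.parse().
  const url = new URL(endpoint);

  // fetch() is a global in all of the runtimes above; no node-fetch needed.
  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(`Request failed: ${response.status}`);
  }
  const body = await response.text();

  // Web Crypto stands in for Node's crypto module, portably.
  const digest = await crypto.subtle.digest(
    "SHA-256",
    new TextEncoder().encode(body),
  );

  // Render the digest bytes as a hex string.
  return [...new Uint8Array(digest)]
    .map((byte) => byte.toString(16).padStart(2, "0"))
    .join("");
}

// Usage is identical in every runtime:
// const hash = await fetchAndHash("https://example.com/");
```

The payoff is exactly the one the newsletter argues for: switching runtimes becomes a deployment decision rather than a rewrite.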