Hm. It only takes a life of study and a lot of pain to understand that #2 is the thing. But most of us get to experience the latter without experiencing the former, so for most people #1 is the preferred option.
#1 leads to theism and offers an immediate balm. Unfortunately, it mostly excludes #2, and that leaves us in the merciless hands of God.
Can you imagine going to a football match and second-guessing which players look human but are actually androids made in a factory? This is what music and literature feel like right now with so much AI. There are some pockets where you can still say "that's human-made", like 3D-rendered feature films with a particular artistic direction. Even that, it seems, AI companies want to send the way of the dodo.
Yesterday I saw a clip that went "viral" of a few hogs chased by a humanoid robot somewhere in Poland. I had to watch it a few times to figure out if it was real or generated. I still wasn't 100% sure. Asked around in a group, and apparently it's been widely reported on regular news, so I guess it's real? But we're slowly getting to the point where you won't be able to tell, especially from a short clip on a phone.
Yes, and thanks for sharing the experience of the hog video - it was recommended to me too and I chose not to click, as I did not want the frustration of seeing another "tech run amok" example, of tech disrupting YET ANOTHER norm.
Relatedly, IMO "trust" as a word / concept is deserving of being reevaluated nowadays.
E.g. I don't know that you, NitpickLawyer, are a real person. And when I go through the mental exercise of inventing the details, proofs, and evidence I'd need in order to satisfy my doubt, I never succeed until I reach the physical-contact-with-NitpickLawyer condition.
So I think we each need to evaluate what is necessary to operate in society, separate from these untrustable things... such as media / news reports, and all the other things I just don't want to worry about right now. :-(
No-one cares, dude. People like good-enough, convenient things that serve their entertainment needs, which are in turn shaped by said entertainment, so there is not really an issue here.
Since they are up against an insurmountable mountain of capital which will commoditize and optimize whatever it wants, they are kind of in for a pointless fight with an inevitable end. They could save themselves a lot of despair if they saw the writing on the wall and pivoted to something that still has value, or accepted the new reality instead of throwing a fit.
That is too difficult as the concept (of trusting one's perception) is, I believe, intertwined deeply with other aspects of being human, for many people.
It's not reasonable to require that those people be mentally organized in a manner that already mistrusts reality, in a healthy manner.
Maybe it is a pointless fight with an inevitable end but at least I'll die with my humanity and dignity intact rather than being a boot licker for Sam Altman, but you do you.
You can die with your humanity at a farm growing veggies, surrounded by people you love, and still be consistent with what I write. Seeing the inevitable does not equal loving or wanting it.
I care deeply. It is not single-handedly going to destroy humanity. However, we are clearly on a course where people are more isolated, less challenged, less social, and very very very unhappy. Music is one of those things that can really bring people together. If we flood the zone with AI music (or any other art form) we will slowly edge out the humans who are doing that. That is less new music. Less chances to come together. Less chances to dance together. It's a death by a thousand cuts. I, and many others, think it's worth fighting for because we want others to have the amazing experiences we're having.
Every generation has a new baseline. The younger generation will not be able to imagine having anything other than doctors and psychologists on the phone, and they are content with it because it's all they know. Social media might be all the social connection they have, and it will be where they have their best experiences; they won't know another baseline. Eventually maybe the best experiences will be had with digital companions, etc.
The only losers here are old or bitter people who have tied up their worldview into their own time and cannot see or comprehend that the world has moved on with a different bound for the experiences and expectations.
> Eventually maybe the best experiences will be had with digital companions, etc.
Obviously I can't speak for all of Gen Z (and I realize we're no longer "the younger generation"), but my friends and I don't want any part of this, and feel optimistic rather than bitter that things won't go the way you're describing. I seldom meet anyone in my age group who isn't talking about moving away from social media, cancelling software subscriptions, all of the things that millennials and Gen X seem to be so excited to continue building and promoting.
Even at my workplace the "older" people are the ones that are excited about stuff like AI jazz remixes of rap songs and AI generated short films, while literally everyone else under 30 finds it pretty cringe and makes fun of them in DMs.
So all that to say, I disagree with your outlook, but I guess time will tell.
Talking about something and doing it are different things. What are the social and market structures around your friends that let them avoid having a smartphone, cancel subscriptions, and uninstall everything? Do you see this getting better with media consolidation via Substack (Andreessen), Twitter (Musk), and YouTube channels owned by the hyperscalers/billionaires, plus questionable mergers like Paramount and Warner Bros?
When the social culture is based around platforms and content that has subscriptions, and when media and what you see is consolidated, you can't just exit without losing a big part of the social context because the people around you are eating the same thing.
I dislike slop as much as anyone else. I think it puts a higher burden on the receiver of information to filter the signal in a pile of trash. I just don't really see an actual way out if you look at it from a societal level with the existing structures and incentives.
> you can't just exit without losing a big part of the social context because the people around you are eating the same thing.
That's exactly it. The goal is to lose a big part of the social context. It's driven by rage bait, AI bots, state actors, and a thousand other influences that are predominantly negative. Of course amazing things happen online. However, the good is not worth the bad. I'm raising my kids and they will never have a smartphone. Will they miss out on some things? Of course! They also won't have their attention span destroyed, their ability to be bored and creative in the real world destroyed; they won't have body issues, they won't be caught up in the alt-right pipeline, they won't have their brains fried by content like Mr. Beast which is designed to be as hyper and addictive as possible. Missing out on the current social context is the entire goal. People were happier before it.
This expects all of their friends to live in similar systems. Otherwise their friends will talk about games, memes, and shows at school while your kids are isolated, not part of the culture and not in the loop.
I think this is only possible if you find a community with similar values, like religious, or hippie, where the focus is put on other things. Otherwise you might deprive your kids of what you want to give them because they will not feel socially connected.
I am not an idiot. I'm well aware they will pick up things at school. My 5-year-old already knows who Mr. Beast is. He's never watched a video of his and never will at my home. If he watches one or two at a friend's house, that of course is going to happen. But he won't be consuming that poison regularly every day. My 8-year-old is doing just fine. Happy. Healthy. Active. Lots of friends. And when they're older and fully functioning adults, unlike some of these Gen Z zombies who have had their brains fried, they will thank me.
The pearl clutching over the pedigree of art is getting tiring. No one has really ever cared. Most mainstream music is written by corporate teams. Elvis didn't write his own music. Frank Sinatra didn't write his own music. Nearly all pop artists don't. But suddenly, people are now clamoring for art, but they never gave a shit to begin with. Most people can't tell AI written music from anything else if a human performer played it. Most of it is better than any local bands anyway. Tired of people pretending they care.
It’s subjective, because it’s art. There’s no right answer.
If you like listening to AI generated content, then that’s fine! I’m glad you found something you enjoy.
For me, I consume art because I want to understand other people. For example, when I go to an art museum I want to emotionally connect with the artist: to feel what they were feeling, or understand an idea they're conveying. I have little desire to emotionally connect with stochastic token sampling. It seems a vapid way to spend time.
You still assume the artist in those examples is real. It could be a team, a ghost artist, etc - yeah, it's less likely than with music, but still. The connection itself is quite difficult too, given the ease with which someone could plagiarize others' work - sure, they have mechanical skill, but did they really invest in the painting, or was it ripped off from others' ideas?
I suspect your connection to real artists won't be impacted. This, like the music example, just highlights our assumptions.
I'm not defending this AI garbage fwiw, I just don't think it's as interesting as most people make it out to be. I adore music, and I connect with the songs I connect with. I don't typically think about the possible ghost writers, teams of writers, ghost players, etc. The music either speaks to me or it doesn't.
Though I'm not trying to connect to the musician as a person. However, as I was illustrating - if I really wanted to connect to musicians at face value, that ship sailed many, many years ago. Far before AI.
There are ways to mitigate this, but that balance will always be there - it was before AI, and it will be after. It's an evolution. Not an enjoyable one perhaps, but it is nonetheless.
I arrange gigs with real bands playing music. At least that will take quite a while to replace with AI. I am curious to see if we will get a backlash eventually around the content. It will probably be a mix of everything.
Storytelling didn’t go away when the theatre was invented. Theatre didn’t go away when cinema arrived. Cinema wasn’t replaced when radio arrived, and that wasn’t completely replaced by TV, etc. It is a mix of things these days and it will probably remain that way.
If Frank Sinatra had AI he wouldn't have had to perform any of that slop by Cole Porter, Irving Berlin, Kurt Weill, Rodgers & Hammerstein, and other composers no one cares about.
Can you imagine watching a movie and not being able to tell which scenes have CG special effects and which don't? Oh no!!! CG totally ruins all movies!!! Even movies that don't use CG are ruined by the tension of dreading that they might, and wondering if they do, and doubting everything you see on the screen, even if they don't. CG has ruined everything!
> like 3D-rendered feature films with some particular artistic direction.
This is a really interesting example. Why do you foresee artistic direction going away as a result of AI? More importantly: why didn't we lose that with the transitions through the years of special effects - i.e., from practical to 3D-rendered?
It's not an uncommon opinion that we did lose artistic direction and aesthetics by moving to VFX - the ability to edit more and more things in post to change the direction or plot of a film seems, to me, to have enabled more design-by-committee in Marvel films, etc.
It's not just politics. A while ago, as an experiment, I wrapped some teleological[^1] questions in a small story of a demon offering a slightly ambiguous bargain to a person. Then I had a lot of fun having the frontier models evaluate whether the demon was "good" or "bad". ChatGPT ranked as a rancid right-wing conservative ready to burn somebody at the stake, while Opus's reasoning was chill. Interestingly, both models could clearly "understand" the deal, i.e. reason about its final consequences for the trapped soul, but ChatGPT moralized a lot and made about as much sense as a stubborn priest.
"Rock Band" spawned an awful lot of guitarists (who can play a zillion notes per measure).
There's probably something similar for drummers, but nobody cares about them.
"Do not attribute to malice what can be attributed to incompetence" and all of that, but...
Alzheimer's (like a gazillion other diseases) is a product of senescence, and senescence is a subject that faces strong ideological headwinds. That leaves the medical system in a situation where they want to treat the symptoms (the diseases) instead of the root cause, and treating the set of symptoms that we call Alzheimer's is going to be tough.
The single best predictor of Alzheimer’s disease is chronological age. Yet of NIA’s $4.5B FY2025 budget, about 60% goes to Alzheimer’s and related dementias; only 9% to the Division of Aging Biology, which funds the basic mechanisms of aging itself.
The insider joke is that the “A” in NIA stands for Alzheimer’s. The deeper and sadder joke: when you fit cognitive trajectories within the subset of people who go on to develop AD, about half of the variance in their test scores is accounted for by chronological age alone.
We’re spending seven times more on the disease than on the clock — and the disease is mostly the aging clock.
As a society we are so used to seeing our grandparents and parents die that thinking otherwise is near impossible. I wish we saw ageing as the disease it is, instead of a "natural therefore good" thing.
Yes, and less than $1 billion/year spent on the basic biology of aging by NIH/NIA. Funding for research on the root causes of aging is decimal dust compared to distal consequences such as cancers, cardiopulmonary diseases, renal failure, immune diseases, Alzheimer’s, etc.
NIH is great at tactical research but terrible at strategic research, and politicians do not help much ;-)
This is some convoluted BS built on the premise that wars need to make sense, economically or otherwise. No, wars do not need to make sense. If a person, a dictator or a president, unilaterally starts a war that forfeits the lives of both the dictator's (possibly fabricated) enemies and its own people, that person is knowingly committing murder. Logically, such a person should be handled with at least as much prejudice as a lone wolf that opens fire on a crowd. So we need to fix our legal systems to be better at preventing wars, not our economic systems to be better at fighting them.
How deep does this library reach into the OS? I’m searching for something I can use from microcontroller code, where there isn’t full POSIX or Win32 support.
I wouldn't make much of it; the economy looks a bit iffy right now due to the surge in energy prices and difficulties sourcing inputs. This affects mainly industrial enterprises, shipping, and transport, but those are no small sectors, and anything that affects them ripples through the rest of the global economy. Where I live (Northern Europe), not only are those sectors already sacking people, but the banks are raising interest rates well ahead of an expected wave of inflation. This affects both consumer and industrial loans, and it means that many economies are going to continue in contraction, or that things may get worse.
Raising interest rates right now makes no sense to me. Energy prices and layoffs will kill spending power. I think the central banks will overcompensate because they got inflation so wrong the last time.
Inflation has been persistently above 2% (and arguably much more, as the current methodology for measuring inflation is quite flawed). There's a definite risk of inflation expectations shifting, which central bankers really want to avoid.
Your point that there's a recessionary risk is real, but lowering rates might lead to stagflation. Both options are pretty bad honestly.
I mean the American recovery after a global disaster like Covid was actually pretty good given the situation. What would you propose they should have done instead?
I was thinking about the upcoming regulation about replaceable batteries in the EU, and couldn't help but think that if I were Apple's CEO this would be a great time to make an orderly exit. Make no mistake, I'm not a fan of i-Devices' non-replaceable batteries, but I can't remember a single device with a lid for batteries on the back that was aesthetically in the same league as an iPhone.
As far as I know it should be pretty easy for Apple to comply with the regulation. The battery needs to be replaceable with standard or freely available tools and without adhesives. Many of Apple’s devices already meet this standard.
Edit: I'm not sure on the adhesives part. Apple uses an electrically-releasable adhesive in some of their newer products. The MacBook Neo doesn't use battery adhesive at all.
There are considerations in the law for water proofing, device safety, and battery durability (maintaining 80% capacity at 1000 cycles, which Apple already does). They do not require a pop open battery door on every device like it's 2005 again.
Apple already provides repair tools, guides, and replacement parts both to end users and third party technicians.
These regulations are complicated, but they aren't new and Apple isn't being blindsided with some catastrophe here.
I don't think any of the iPhone or iPads do. Their design is pretty tightly coupled to weird shaped, permanently attached batteries, from what I've heard.
I've read that Apple's products fall outside the scope of the regulation because their product batteries can do 1000 cycles and still hit the 80% benchmark.
I don't know whether the newer electrically-releasing battery adhesives would count, but they do allow cleanly removing and replacing the battery without proprietary tools.
They’re not proprietary but some of them are expensive and somewhat specialized. I don’t think it’ll be really economical for most normal people to self-service many repairs, but it’ll be very viable to have a corner hardware store that can do it for you for cheap. Self-servicing battery replacements ought to be doable with an eyeglass screwdriver though.
Reading some articles about the EU law, which is more complicated than the seemingly popular interpretation that all phones are now going to have tool-free battery doors on the back like it's 2005 again.
To be clear, replaceable battery doesn't mean a lid like phones used to have. It means that you should be able to take the device apart with simple tools and remove the battery and pop in another one.
It actually probably affects other phone companies more than it affects Apple, as some of the others have very poor repairability
The Microsoft Lumia 540 looks remarkably like a modern phone still and it had a fairly easily replaceable battery, because it allowed you to replace the back cover.
There's also the Lumia 920, which is arguably a nicer-looking phone than anything Apple currently has; it also had a fairly easily replaceable battery, requiring you to remove just two screws.
> The battery thing doesn’t apply to water resistant devices, so doesn’t matter for iPhone/Apple Watch.
I think that is not true. If you look at Article 11.2(b), it talks about "appliances specifically designed to operate primarily in an environment that is regularly subject to splashing water, water streams or water immersion, and that are intended to be washable or rinseable".
I don't think that applies to Apple devices. Also these special devices still need a battery replaceable by a professional.
I'm not sure how they are related. USB-C was not really a technical challenge, nor did it involve trade-offs. I'm not a hardware engineer, but from what I've read, having an easily replaceable battery would degrade the water resistance of the phone.
It's funny because Apple said on stage Lightning was their connector for the next decade in 2012 then shipped it for exactly a decade, despite being the first to ship a USB-C device. When they switched to Lightning in 2012, the peanut gallery complaint was Apple making everyone buy new cables and accessories. Either too fast or too slow for their critics with the same timing.
Don't get me wrong, there were plenty of people in the more toxic parts of Apple's fanbase decrying USB-C for appearing too fragile, for being forced on them, for having a confusing set of standards (that last one is a fair point).
But I think, among Apple fans, USB-C has generally been a point of 'pride' for the past decade. Designed by Apple, put in a laptop first by Apple, best $10 USB-C-to-3.5mm DAC by Apple, etc.
Whether correct or not, I think Apple fans anticipate more severe tradeoff ramifications with a replaceable battery. I think they're different things. (I don't think it's impossible though: the Fairphone has IP55, and I bet Apple can improve on that.)
Lightning is a superior physical design to USB-C (can't speak to the electrical part). Much like every major tech battle in history, however [1], the worse solution won because of ubiquity. I'm not particularly thrilled because I've had a USB-C connector irretrievably break off in a port once on a laptop but I'll make that trade for being able to use a single cable for all of my devices.
- Not an "Apple Faithful"
[1] VHS vs Beta, Doom vs Marathon, Zergling vs human, etc
I've never had trouble with usb-c, but have had lightning connectors short out and burn one of the leads, or stop working from dust. Not sure I'd say one is better than the other, but individual experience can really vary on these kind of things. Tough to say one is clearly better imo.
Yeah it seems to be really up to individual experience. I had two apple-made lightning cables that shorted and burned out the leads on the male end of the connector.
The iPhone 4 was not water resistant. I remember owning one and being absolutely freaked out about it getting wet. Talk about an expensive paperweight.
I wish them well, but this has all the signs of nanotech vaporware. The article even mentions graphene, and empirically, that seems to be a kiss of death for every promising technology announced in the past three decades.
The tech to make your own PCBs at home already exists, it just doesn't really make sense to spend weeks making inferior boards when you can order custom boards for pennies and have them at your doorstep in a bit over a week.