> as building software gets easier [...] new entrants to displace Bad Old Software
This didn't happen for music.
It is much easier to create and record music today than in the 70s and 80s, but the music created today is mostly boring AI music, not new, exciting, inventive music.
A lot. The period from National Romanticism onward is the most relevant for any study of a "nation".
Before that you could only think in terms of loosely connected realms and kingdoms, and earlier still in terms of tribes and some city-states. Those aren't very useful for understanding the present; most of current culture branched out from the 17th century onward.
The historical connection to the land of the peoples and tribes who lived in the territories of modern Europe before the Middle Ages is more akin to studying Native Americans in the USA: they were the people inhabiting the land, and they had their traditions. Some of those traditions were used to forge the national identity of present cultures, but much of that national identity was myth-making by National Romantics, meant to generate the sense of unity needed to create the nation and the nation-state.
A lot! The majority, surely. The period from the 18th through the close of the 20th centuries was a time of tremendous upheaval, where nations were forged. German students, for instance, don't spend all of their time on the HRE; they tend to focus more on the nation-forming events of the 18th and 19th centuries, and then of course the 20th.
Of course, students also learn ancient and ancient-adjacent history -- the Mesopotamians, Egyptians, Greeks, Romans, Charlemagne, etc. -- but this is general and isn't unique to any national tradition, but common to the entire continent.
It's not new. BOMStore is a format inherited from NeXTStep. JSON didn't exist back then.
Also, it's a format designed to hold binary data. JSON can't do that without hacks like base64 encoding.
Binary file stores like this are very common in highly optimized software, which operating systems tend to be, especially if you go looking at the older parts. Windows has a similar format embedded in EXE/DLL files. Same concept: a kind of pseudo-filesystem used to hold app icons and other resources.
>Looks very much like a format that should just have been gzipped JSON.
For application file formats that need to store binary blob data such as images, bitmaps, etc., many in the industry have settled on "SQLite db as a file format": (https://www.sqlite.org/appfileformat.html)
Examples include Mozilla Firefox using a sqlite db for favicons, Apple iOS using sqlite to store camera photos, the Kodi media player using sqlite for binary metadata, the Microsoft Visual C++ IDE storing source code browsing data in sqlite, etc.
A sqlite db would usually be a better choice than binary blobs encoded as Base64 and stuffed into gzipped JSON files. One area where the less efficient gzipped JSON might beat sqlite is web-server-to-web-browser data transfers, because the client's JavaScript engine has built-in gzip decompression and JSON manipulation functions.
My favorite example of this is Audacity, which stores its entire project file, including all the audio data, as a SQLite database. It's really amazing: you can stream audio data from many input sources into a SQLite database at once and it won't break a sweat, and it's flexible and extremely safe from data loss due to crashes or power outages. It's also trivially mutable: in many formats of this kind, "export" is a heavy-duty operation that serializes everything to JSON or whatever, but with SQLite it's just a question of updating the database with normal INSERTs, which are cheap and simple operations. We've put a similar system into production, and it's incredible how well it works.
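A minimal sketch of the pattern using Python's built-in sqlite3 module; the table and column names here are made up for illustration, not taken from Audacity:

```python
import sqlite3

# Binary blobs go straight into a BLOB column, no base64 needed.
con = sqlite3.connect(":memory:")  # a real app would use a file path
con.execute("CREATE TABLE assets (name TEXT PRIMARY KEY, data BLOB)")

pixels = bytes(range(256))  # stand-in for real image/audio bytes
con.execute("INSERT INTO assets (name, data) VALUES (?, ?)", ("icon", pixels))
con.commit()

# Reading a single asset back is an indexed lookup, not a full-file parse.
(restored,) = con.execute(
    "SELECT data FROM assets WHERE name = ?", ("icon",)
).fetchone()
assert restored == pixels  # round-trips byte-for-byte
```

The point is that "update one asset" is a normal UPDATE or INSERT, with SQLite's journaling protecting you from partial writes.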
Strong disagree. I like binary formats because I can just fopen(), fseek() and fread() stuff. I don't need a JSON parser dependency, and I don't need to deal with compression. Binary formats are simple and fast, and I need a much smaller buffer to read/write them. I don't like wasting resources.
Binary formats (that you can just fopen(), fseek(), fread() etc.) are generally super platform dependent. You can read in an array of bytes and then manually deserialise it (and I’ve done this, for sure!) but that’s basically one step away from parsing anyway.
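The manual deserialisation step can at least be made portable by pinning the byte order and field widths explicitly, instead of relying on the host's native struct layout. A sketch in Python's struct module, with a hypothetical three-field record:

```python
import struct

# "<" pins little-endian with no padding, so the layout is identical on
# every platform: I = uint32, h = int16, B = uint8 (7 bytes total).
# The record fields (magic, version, flags) are invented for this example.
record = struct.pack("<IhB", 0xDEADBEEF, -7, 42)
assert len(record) == 7

magic, version, flags = struct.unpack("<IhB", record)
assert (magic, version, flags) == (0xDEADBEEF, -7, 42)
```

The C equivalent is a format spec plus per-field reads, which is indeed "one step away from parsing", but a much smaller step than a general-purpose text parser.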
I'd hazard a guess that serializing demented data structures into something text-encoded like JSON or YAML or XML/SOAP is no less painful than a straight binary representation, aside from the text side having familiar tooling to reason about and arbitrarily query the structure, like jq, yq, etc.
Every format is binary in the end, you are just swapping out the separators.
I personally subscribe to the Unix philosophy. There are really two options, binary or plain text (readable binary thanks to an agreed standard formatting). All other formats are variations of the two.
Additionally, a binary format makes sense when the format is to be used on mobile devices, which may not have much storage or bandwidth.
JSON is… adequate. I like binary formats, too, when doing low-level programming, but you often want something readable, and JSON is a lot better and easier to parse than many other things, say XML.
> Don't use binary formats when it isn't absolutely needed.
JSON in particular isn't very good [0] but I'd also argue that text formats in general aren't great. Isn't it a lot better that an int32_t always takes exactly 4 bytes instead of anywhere between one and eleven bytes (-2147483648)?
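A quick illustration of that size difference, again using Python's struct for the fixed-width side:

```python
import struct

# A 32-bit int is always exactly 4 bytes in a fixed-width binary format,
# while its decimal text form ranges from 1 to 11 bytes.
n = -2147483648  # INT32_MIN, the worst case for decimal text

assert len(struct.pack("<i", n)) == 4      # binary: always 4 bytes
assert len(str(n).encode("ascii")) == 11   # text: up to 11 bytes
assert len(str(7).encode("ascii")) == 1    # ...and as little as 1
```

Fixed widths also mean you can seek to the Nth record by arithmetic instead of scanning for delimiters.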
I would agree that binary formats can be much better. Sometimes you can use fixed fields, but there are also structured binary formats such as DER (which is generally better than JSON in my opinion). JSON has many problems, including that it has no type for non-Unicode data (so you must use base64 or hex encoding instead, which isn't very good), among others. Parsing it also means you must handle escaping.
One use-case of what? Whether a format is "streamable" is entirely orthogonal to whether it is text or binary. Streamability depends on other things such as if it has a header or checksum at the end, et cetera.
Uh. You want to store assets in JSON? Why? You generally want asset packs to be seekable so that you can extract just one asset, and why would you want to add the overhead of parsing potentially gigabytes of JSON and then, per asset, decoding potentially tens of megabytes of base64?
Why not have both options? Having both .gltf and .glb available for assets has helped me more than once; having the option gives you the best of both worlds :)
What's Apple's incentive for having two different asset pack formats? It seems like more work to support parsing and generating both, and Apple expects you to use their tools to generate asset packs.
Working with binary files really isn't that hard. If Apple documented the .car format, writing your own parser wouldn't be difficult. It's not like it's a complicated format. Still, Apple clearly doesn't intend for people to make their own .car generators; to them, ease of reverse engineering is a bug, not a feature.
Easier for the developer or easier for the computer?
Computers need to do it a bunch for every program launch for every single user of macOS for decades. The developer just needed to write a generator and a parser for the format once.
Would it have been a bit easier to write a parser for a format that's based around a zip file with a manifest.json? I don't know, maybe. You'd end up with some pretty huge dependencies (a zip file reader, a JSON parser), but maybe it'd be slightly easier. Is it worth it?
JSON is for humans to read and maintain, not machines to read/write - and machines read/write millions of times more than humans do.
Using JSON / HTTP for backend communication (i.e. microservices) is madness when you consider how much overhead there is and how little value having it be human-readable delivers.
You could make the argument that Patreon isn't much more than a banking app.
It just focuses on the receiver of the money rather than the sender.
I think Apple is slowly killing apps with this policy. Everybody will slowly move to "web only" as 30% would kill their ability to compete with anybody else. This will likely be much stronger in countries where iPhones do not have the same market share as in the US.
Apple users seem to be fine with everything being much more expensive. It's not only the 30% Apple tax itself: developers know Apple users pay more, and they set higher prices on Apple.
Google allows out-of-store installation (for now...) so it's much easier to argue there's competition. Apps installed through F-Droid don't have this tax.
And in the EU alternative app stores are allowed on iphone as well. In both cases, it’s a near negligible amount of people that use them. Your exceptions prove the rule, if anything.
This would not be true if all markets rose simultaneously. It’s why they all fight so hard to delay the inevitable. You don’t have to win, you just have to win for long enough to be established
> Everybody will slowly move to "web only" as 30% would kill their ability to compete with anybody else.
Frankly, yes, please. I mean, I'm biased as my whole career is in web app development, but there are so many things these days that do not need a whole native app. They're just communicating with a server backend somewhere, using none of the unique native functionality of the phone (much of which is available in browser APIs these days anyway). I can block ads in a web app much more easily. It's much harder to do customer-hostile things like block screenshots in a web app.
Native apps definitely have a place, but I think they're very overused, mostly for reasons that benefit the business at the expense of the customer.
You still can't have a "share to" target that is a web app on iOS. And the data you can store in local storage on Safari is a joke.
Of course, forget about background tasks and integrated notifications.
In fact, even on Android you miss features with web apps, like widgets for quick actions, mapping actions to buttons and so on.
And no matter how well you cache things, the mobile browser will unload the app, and you will always get friction when loading the web app in a fresh render, which you don't have with regular apps.
No, I use them but loading and unloading the app in the tab still happens when the browser flushes the app from memory because the OS killed it or the browser eviction policy hits.
This loading is not nearly as seamless as a regular app starting back up.
For a regular app, you have the app loading, and the os cache helping with it. If you do your job half correctly, it loads as a block almost instantly.
For a web app you have the web browser loading, then the flash of a white viewport, then the app loading in the browser (with zero OS cache to help, so it's slower). It then needs to render. Then it has to restore the scroll position (which is a mess in a browser) and as much state as it can, but you are limited in persistence size, so most content must be reloaded, which means the layout moves around. Not to mention JS in a browser is not nearly as performant as a regular app, so as your app grows, it gets worse.
I disagree, native apps on iOS have important abilities that no web application can match. The inability to control cache long-term is alone a dealbreaker if trying to create an experience with minimal friction.
I think the parent may be referring to the fact that Safari/WebKit will evict all localStorage/IndexedDB/caches etc. after 7 days of not visiting a site. And apparently this now extends to PWAs, making it a pretty big blocker for building any infrequently accessed PWA that needs to persist user data locally.
Those same elevated controls are used to steal PII and sell to data brokers. Again, it's the companies that are trying to force apps on their users. If it were genuinely a much better UX, they wouldn't have to do that.
I don’t think you are correct, but I could be wrong. For example, can you replicate the functionality of TikTok, autoplaying unmuted videos as the user scrolls down to new videos? That's the experience the user expects.
I've probably deleted 15 apps from my phone in the past year as I steadily move over to the web for everything.
My chat agent, file transfer tool, Grubhub, Amazon, YouTube, news, and weather apps are all deleted in favor of a set of armored browsers that suppress the trash and clean up the experience. It's been an amazing change, as those companies no longer get a free advertisement on my phone's application grid, making my use of them much more intentional.
Sure, once the user interacts with the first video.
If third party native apps were installed and run without user interaction the same as cross-origin redirects, I would expect the same limitations with native apps.
I use FB via my web browser (Firefox on Android) and when I look at Shorts, it has this exact functionality. Web browsers on mobile can do this, clearly.
The Android Browser isn't as crippled as the iOS one. Watch a full screen video on Safari and tap a few times on different places on the screen and you will get a notification about "Typing is not allowed in Full-Screen" or some other nonsense
You couldn't make that argument because Patreon is also a platform to host content, not just send money. If it was something like a twitch donation app the argument would make more sense
The hosting aspect is only necessary because a) piracy and b) Google would eat their lunch if Google were the gatekeeper to content. A bit like how Ticketmaster takes all the money from artists because they get to say who sits in a seat.
Honestly I wouldn't be that shocked if Apple tried demanding a 30% royalty on bank deposits and bills paid using iPhone apps. They've decided the future of their company depends on being huge assholes about it.
I would be surprised by that because iPhone users would notice that. I think the App Store model relies on their fee being invisible to consumers, and the increased price you’re paying not being linked to them. AFAIK apps aren’t allowed to explain that they charge more if you subscribe on iPhone to users either, or why they do so.
True, hard for bank deposits where the user sees both ends of the transaction.
For bill payments though, they'd just insist on taking 30% of your electric bill payment and if the electric company's margins aren't high enough to absorb that then "Haha that sounds like a you problem" - Tim Cook, probably
While you're correct, it's worth noting that this only happened because the judge in the Epic lawsuit ordered an injunction forcing Apple to allow it.
Apple then "maliciously complied", allowing it while demanding a 27% fee on any web-based payments, which was found to be a violation of the injunction.
> For U.S. fans, there’s still a way to avoid Apple’s fee. When signing up in the iOS app, they can choose web checkout instead of Apple’s in-app purchase system. Apple’s rules require that any paid content shown in the iOS app is also available to purchase through Apple’s in-app system.
When you use Apple Pay, Apple collects ~0.15% (15 bps) from the issuing banks for credit.
$1B in transaction volume = $1.5M
In 2022 the total volume was estimated at $6T * .15% = $9B. Real number would be maybe half due to lower fees on debit, but it's hugely profitable for Apple, and carries zero risk.
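The arithmetic, spelled out with integer basis points to avoid float rounding. The 15 bps rate and the $6T 2022 volume are estimates from the comments above; the "maybe half due to debit" discount is the commenter's rough guess, not a published figure.

```python
FEE_BPS = 15  # ~0.15% that Apple reportedly collects on credit transactions

per_billion = 1_000_000_000 * FEE_BPS // 10_000
assert per_billion == 1_500_000            # $1B volume -> $1.5M

gross_2022 = 6_000_000_000_000 * FEE_BPS // 10_000
assert gross_2022 == 9_000_000_000         # $6T -> $9B, before the debit discount
```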
I think this is a very strong and simple argument to use with regulators, politicians etc.
When I put my credit card into Apple's ecosystem, they take a 0.15% cut of the transaction and appear to be very happy with the results. When I put my application into the ecosystem, they take 30%.
You can then break down why this is, but boy is that an interesting contrast.
Something interesting is that Apple Pay and Google Pay charge a tiny commission (I don't have the number at hand). The banks didn't like this, so at least on Android they created their own NFC payment stacks for a while, only to discover that maintaining such a stack cost them more per year than the commission.
My German Bank still maintains their own pay stack. It's also nice that I don't have to share my payment information or commission with another parasite