Hacker News | chrisma0's comments

One of the advantages of Overleaf is that I can share a link with a collaborator and they only need a browser to participate. I assume with this, everyone will need to download the editor?


The editor is a web app! Because we build Typst in Rust, we are able to deliver it as WebAssembly and embed it into the website.
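For anyone curious how that works in general: the sketch below is a hypothetical minimal example (not Typst's actual code) of the Rust-to-WebAssembly pattern. An ordinary Rust function exported with a C ABI can be compiled with `cargo build --target wasm32-unknown-unknown` and then called from JavaScript; real projects usually layer wasm-bindgen on top for richer types.

```rust
// Hypothetical sketch: a Rust function exported with a C ABI so a
// wasm32-unknown-unknown build exposes it to the JavaScript host.
#[no_mangle]
pub extern "C" fn add(a: i32, b: i32) -> i32 {
    a + b
}

fn main() {
    // On a native target the same function can be exercised directly.
    assert_eq!(add(2, 3), 5);
}
```

The same crate compiles unchanged for native targets, which makes it easy to test the core logic outside the browser.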


Especially in the research space, I really recommend Twitter. In software engineering, many really interesting researchers and practitioners share their experiences there. In my experience it's a great way to virtually meet people you would probably otherwise never encounter.

Heard of "Scrum"? Follow Jeff Sutherland (https://twitter.com/jeffsutherland). Like Conway's Law? Follow Melvin Conway (https://twitter.com/conways_law).


For me the novelty wears off the moment you start seeing these people's family pictures or political commentary. Even if I agree with them, I don't care; it's not why I follow them, and to me it's just noise.

I'm not saying they shouldn't post such content, don't get me wrong. It's just that Twitter becomes a distraction, your list requires constant maintenance, and the signal-to-noise ratio isn't worth it for me.


I've tried Twitter a few times and I always leave for this exact reason. I wish there were more filtering or tagging options... so I could subscribe to @someone#research or @other#hobbies and avoid all the noise. Some people have dedicated accounts, but that seems to be a minority; I wish it were more common.


This is an issue with social media IMO. Not everyone uses every platform, so people without Twitter post politics/memes to Instagram (which I find really jarring), and those without Instagram post personal photos to Twitter when followers might be expecting just opinion/knowledge. Presumably platform owners have decided that per-account channels are off-putting for most users.


This. Having sub-channels in social media is long overdue; it would be a killer feature. If I want to post baby photos, I can freely spam the "family" sub-channel, whose subscribers will be more than happy to consume them. Similarly, we can cater to a different set of followers with tech posts.


I just unfollow anyone who does that. There are quite a lot of people who just stay “on topic”, more than enough to fill a feed. So while I might miss some interesting content from people I otherwise unfollow, it turns out a lot of the time someone else retweets it anyway.

Unfollowing someone when you see something off-topic takes very little time.

I do agree with the other reply that being able to subscribe to topics from people would be awesome, but in the meantime it’s pretty good if you maintain a zero-tolerance policy for off-topic stuff.


Twitter is used by a small and quite homogeneous percentage of the population, so you might not get good advice on Twitter.

Email is great.

My personal experience is that I did not get good feedback this way. But the feedback from paying customers is gold: honest and clear.


Ah, right, this article reminded me that ARM was bought by SoftBank back in 2016. It seems they actually picked some winners, not just WeWork and Wag.


Pink sauce?!

Wait, hold up: "She makes it in a commercial facility that is certified by the Food and Drug Administration", and she's been "serving it to my clients for a year", but also "This is a small business that is moving really, really fast".

A small business using a commercial FDA-certified facility to make a pink sauce and the recipe has apparently been used for a year is moving really, really fast? Really, really fast in the marketing department, I would assume...


"pure HTML/CSS/JS" sounds great, but then there is no demo hosted directly on GitHub Pages? Form submission could just go nowhere.

I would love to get a quick taste of what you think makes this special.


Bottom of the post: https://formie.div72.xyz/forms/

What makes it special for me is that it's simple, open source, and does not track you (and all the bugs hiding in it, I suppose).


This is one of those headlines that initially reads like a joke until you think of the economics of it. "cost is about 5-10% of the cost of an equivalent system built with off-the-shelf computer parts."

I wonder if this is due to the fact that the Playstation hardware is (was?) competitively priced to encourage revenue generation through games? Or was Sony simply very good at mass-producing these units?


Sony subsidizes the cost of the hardware in order to sell games. They were not happy with the Air Force doing this, and it probably played a role in Sony pushing out an update that disabled Linux on all PS3s.


They were very happy about the Air Force doing this.

The whole point of OtherOS (and the official Linux port to the PS2 before this https://en.m.wikipedia.org/wiki/Linux_for_PlayStation_2 ) was to get the system classified as a general purpose computer rather than as a game console because that gave them import tax benefits in quite a few jurisdictions.


That's interesting. There must be other products with "odd features" for tax reasons. There was a case where Ford installed seats in a particular model of van so that it could be imported to the US as a passenger vehicle, but removed the seats before selling. They ended up getting fined, of course, but probably wouldn't have if they had left the seats in and allowed the customers to remove them...


Lots of products. It's called Tariff Engineering: https://en.wikipedia.org/wiki/Tariff_engineering.

Probably the best known example is Converse. "The felt lining on the bottom of the sneakers allows Converse to classify their product as slippers, so the company benefits from a much lower tariff rate." https://blogs.pugetsound.edu/econ/2019/02/18/tariff-engineer...


Similarly, to my understanding, some cameras are perfectly capable of taking video, but disable the feature altogether (or impose a 30-minute recording time limit) to be classified as a still camera rather than a video camera for tariff reasons.


> it probably played a role in Sony pushing out an update that disabled Linux on all PS3s

Pretty sure that was due to it being used to hack the PS3.

> Blame for the latest culling has been pinned on computer hacker George Hotz, who was originally infamous for unlocking Apple's iPhone. In January of this year Hotz claimed that he had successfully hacked Sony's PS3 by exploiting Linux, gaining "read/write access to the entire system memory, and HV [hyper-visor] level access to the processor".

> Hotz released this to the public on 26 January, boasting, "Sony may have difficulty patching the exploit". He may well have been right, since Sony's latest response has been to completely lock off the required 'Install Other OS' feature. Shame on pirates, shame on Sony.


The death of otherOS was entirely to do with the fact it facilitated a lot of the reverse engineering efforts on the PS3, nothing else. As others correctly note, this was largely in response to George Hotz's hacking which required otherOS. Sony actually exposed themselves to legal action in many countries by selling a customer a product with an advertised feature then removing it after the fact, as such behaviour can often fall foul of sale of goods legislation.

If you lived through this period and followed the company, you'll know Sony absolutely adored that supercomputer project; the idea that the Cell processor was a supercomputer for the living room and we'd all be using our PS3s for media editing etc. was genuinely a thought Sony had back then. It all fed into much of the (at times ridiculous) marketing for the Cell chip. Sony had plans for more Cell-based devices that never materialized, too.

> https://www.gamespot.com/articles/sony-gives-glimpse-of-ps3-...

"First, the company will manufacture a high-end workstation using the Cell CPU. Planned for release at the end of 2004... the Cell workstations will be marketed directly to the game and special-effects industries. The labor in their creation will be divided between Sony and IBM. SCE will develop middleware and other tools for game development and film effects. The Cell chips themselves will be manufactured by IBM, who will also work on the OS."


It's a shame they couldn't make it work out anyway. If the cost really was 5%-10% of a similar off-the-shelf cluster, even without the subsidy it would be, what, 10%-20%? That still seems like a steal.

And Sony gets to sell their gaming console as "US Air Force proven" or "a supercomputer in a box" or whatever marketing spin they want to put on it.

And it is only a couple thousand PS3s anyway, so it is a drop in the bucket.

I wonder if there was some behind the scenes stuff going on, maybe IBM worried or angry that this would devalue Cell processors somehow.


I don't feel Sony should have had to worry about it. That many consoles would have been about $1M at the time, a minor amount compared to Sony's total sales. It's probably a good marketing/PR budget, and messaging like "the PS3 is so good that even the military uses it instead of other computers" sounds like it could make it more appealing to customers.

The main risk would have been that more companies buy subsidized PS3s without buying games. But given the highly specialized architecture, I'm not sure there was ever a real risk of that happening.


You think Sony didn't know the Air Force was doing this? It's not like they walked into Walmart and purchased 1,760 PS3s. It had to have been a direct purchase from Sony.


Walmart certainly had 1,760+ PS3s in their warehouses at one point. So the Air Force could have just direct purchased from Walmart (or Target, etc.) if Sony didn't want to sell to them.

2010: 2882 US Super Centers + 608 Sam's Club + 1578 Mexico stores + 321 Canada stores = 5389 stores

People underappreciate the scale of big-box national retail. ;)

https://corporate.walmart.com/newsroom/2010/11/03/walmart-st...


The article states that Sony disabled OtherOS before the USAF got their cluster built, and Sony was recalling and warehousing the PS3s that had the OtherOS feature. The USAF had to negotiate with Sony to acquire these older PS3s that still had OtherOS capabilities.


Almost 90 million PS3s have been sold, so it seems entirely plausible that they could have directly bought 2000 of them through some distributor. Doesn't the military industrial complex prefer to go through its inner circle of buddies anyway?


Having worked briefly in military procurement, I can tell you the system is set up in a very bureaucratic way.

Certainly there are companies that the military wants to buy from. For all the shit about the F-35, Lockheed Martin probably employs some of the greatest engineering teams on the planet.

The C-130, for example, is probably one of if not the greatest aircraft ever designed.

Anyways, there are certain companies that make things militaries want to buy, but for more mundane things like computers and pens and chairs, either there's a negotiated standing offer that legally has to be the first point of procurement, or it goes out to contracts. Unfortunately winning government contracts is a bit of a skill in and of itself and some firms have that skill and others don't.


> For all the shit about the F-35

> The C-130, for example, is probably one of if not the greatest aircraft ever designed.

One of the benefits of not being a "sexy" project is that you don't have everyone and their mother trying to be part of the design process. You can tell the team that designed the C-130 was given two numbers: range and payload, and every other aspect of the design was determined by the engineers.


> The C-130, for example, is probably one of if not the greatest aircraft ever designed.

I don't think I've ever read that before, do you know of anywhere I can read more about why that is?


I'm sure if you google around you'll find some articles but, from my perspective as an aerospace engineer who used to work on C-130s: It's an absolute workhorse.

If you've ever been up close to a museum fighter plane, they're in good shape. The leading edges are all smooth and polished, everything is sleek and in good condition. Line Hercs are not that. They're usually dented and covered in carbon from the exhausts. The leading edge of the wing is like three feet thick. It's a Mack truck with wings held aloft by furious amounts of horsepower.

It's dependable, reliable, and versatile. These things survive being shot at, being landed on gravel, ingesting birds into the intakes, ingesting sand into the intakes. You can start a Herc by putting another Herc in front of it and running the engines up so that the prop wash buddy-starts the aircraft behind, like bump-starting a car rolling downhill.

There are dozens of variants, from the gunships to the EC-130 Compass Call and friends which carry serious business ELINT gear for secret squirrels to do secret squirrel shit with. You can put RATO pods on it. You can use it for SAR. You can drop bombs from it (and not even by throwing them out the ramp, which you could also do). You can use it to refuel fighters and helicopters aerially. You can put skis on it and land it in the snow. You can parachute from it. It's not a jet, but despite being a draggy brick of an aircraft it'll still pull almost Mach 0.6 while carrying two Humvees. Also, those Humvees can parachute from the aircraft.

There's a reason it's so widely-used [0] and that reason is because the Herc is groovy. It's the unsung hero of nearly every military operation carried out by NATO and friends since the 1960s.

[0] https://en.wikipedia.org/wiki/List_of_Lockheed_C-130_Hercule...


The Air Force maintains many recreational centers. I have a buddy that ran these for the Navy. Lots of game consoles, etc. I'm sure they could have explained the purchase that way.


The IBM Cell processor was incredibly expensive.

The PS3 probably was the cheapest way to play with it in practice. There is a reason modern GPUs share an architecture with gaming hardware.

It's not about technical advancement, as much as economics. There are two groups of people who want TFlops of SIMD compute. Supercomputer groups, and video gamers.

Cell / PS3 was one attempt at making one device work with both groups, sharing research and economic investment.

Nvidia, over the next 15 years, would execute on these economics better, however.


> There are two groups of people who want TFlops of SIMD compute. Supercomputer groups, and video gamers.

And crypto people, unfortunately :(


Eh, Eth miners want memory bandwidth, which is increasingly no longer correlated to compute power.


Depends on Proof-Of-Work method.


I mean, it’s also true that not all supercomputer groups want TFLOPS of SIMD. Some video gamers are happy with consoles made in the 1990s. Why single out the crypto people?


90s consoles very much also had SIMD; it was just in more fixed-function hardware, DSPs, or the like.


I thought PlayStations and Xboxes were subsidized by Sony and Microsoft respectively, since they make most of their money off games and network subscriptions. Nintendo is the only one that does not do that.


The PS3 was an extraordinarily expensive console, between the Cell processor and the Blu-ray drive (iirc this alone was several hundred dollars), and early models also included a whole embedded PS2 implementation. So the PS3 was sold at a loss, and not just a small one but a heavy loss, like several hundred dollars per console, and even still the PS3 was derided for being far too expensive. It was a financial disaster for Sony, really; moves like ripping out the embedded PS2 make complete sense in that context, and they absolutely changed their business model for subsequent consoles.

Since then, consoles moved away from the exotic POWER/Cell/etc. custom hardware towards commodity x86 hardware based on integrated x86 APUs, and haven't really been sold at a loss outside of maybe a small window at launch. The PS5 moved into hardware profitability about 9 months after launch; Microsoft said that the Xbox Series is still sold at a loss, but I don't believe them, because the Xbox shouldn't be monumentally more expensive to build than the PS5. This is in the context of them trying to argue during the Apple App Store lawsuit that their lock-in on the Xbox store was different from the lock-in on the App Store, so they have a financial incentive to make sure they "run a loss". It's either not much of a loss, or it's Hollywood accounting and the money is going into their other pocket somewhere, like making the Xbox division pay a parent holding company big licensing fees on every console sold.

https://www.extremetech.com/gaming/325504-sony-finally-turns...

(again, I don't agree with the "finally" spin here, this article was roughly a year after launch and they may have been turning a profit for a while before disclosing it... the consoles themselves become profitable pretty quickly.)

However, this mindset that "consoles are sold at an initial loss" still persists. They're not; Sony has said they're selling the PS5 at a profit. Previous generations also reached profitability pretty quickly after launch. It's not 2005 anymore and the PS3 is gone. Slapping some GDDR5/GDDR6 on a semi-custom APU is dirt cheap.

Even during the launch window, when they do lose money, the loss is much smaller; nobody is losing a couple hundred dollars on each console anymore like on the PS3. That model is gone.


>The PS3 was an extraordinarily expensive console, between the Cell processor and the Blu-ray drive (iirc this alone was several hundred dollars)

Sony singlehandedly did a lot toward the adoption of both DVD and Blu-ray, via the PS2 and PS3. The former's ability to play DVDs out of the box was a huge differentiator between it and the GameCube/Xbox. My understanding is that the PS3 at launch was not only the cheapest Blu-ray player available by several hundred dollars, but also of excellent quality.

>It was a financial disaster for Sony, really; moves like ripping out the embedded PS2 make complete sense in that context

You can see that in the history of PS3 variants. It's normal for consoles to get a major redesign to cut costs late in their history, typically just before or after the successor is out. PS3 redesigns were a) many and b) very early in the console's lifetime, relatively speaking.


I'm fully willing to believe the Xbox/PS5 is sold at a loss today: the margins are so thin that tiny changes in component prices could have enormous implications for how much money each hardware unit delivers. Neither of these consoles is an iPhone; they don't have profit margins of 40% (or probably any double-digit percentage, for that matter). Transitioning from esoteric hardware has pretty much nothing to do with it, anyway: the N64, GameCube, and Wii all used non-standard architectures while being ludicrously profitable. The only truly significant advantage to using x86 in a home console is how easy it is to port/develop titles for it; not a single current-gen console uses commodity hardware besides the Nintendo Switch (since the Tegra board is commercially available).


Nintendo definitely has a different business strategy than Sony or Microsoft.

Anyway, the cost of esoteric / more custom hardware got higher; that’s why the console manufacturers moved away from it. It made sense to shove a lot of custom hardware into your 3D video game console in the mid-1990s, because there was simply no other way to do good real-time 3D, and you had SGI willing to sell you chip designs.

As time went on, the approach of shoving big custom ASICs in your console starts to look worse and worse. Most of the CPU vendors that previously sold you all sorts of architectures like 68K, MIPS, POWER, Cell, etc. stop trying to compete with x86 hegemony. Meanwhile, you’re making life more difficult for console developers, because these custom designs are just so different from everything else on the market.

So you get the PS3, which is expensive to manufacture, and requires a lot of specialized work to program the SPEs (painful for developers). That’s two generations after the N64, and the world has changed.

I would also be less likely to call the Gamecube/Wii architecture exotic, at least compared to the PS3.


Weren't both the Wii/Gamecube and PS3 PowerPC-based?


> I'm fully willing to believe the Xbox/PS5 is sold at a loss today,

Given that Sony themselves have said they're turning a profit, this sounds like a personal exercise in making yourself believe a counterfactual. Some people are into that, though, like the flat-earth stuff, or the people who think Finland doesn't exist. See what you can talk yourself into believing, even when the facts are right there ;)

Anyway, your personal belief or disbelief, or willingness to believe or disbelieve, is kinda irrelevant here, given that Sony has said it themselves.

> Transitioning from esoteric hardware has pretty much nothing to do with it, anyways

Yeah, actually, commodity hardware does have a big role in bringing down costs. Semi-custom APUs are commodity hardware compared to the esoteric Cell/POWER stuff, and some variants are actually available off the shelf as well (see the Ryzen 4700S, which is a PS5 APU with its GPU disabled).

The fact that some custom systems were sold at a profit in the past is kinda irrelevant. The era of "commodity x86 APU with a wide GPU and GDDR memory" is qualitatively different from the era of Cell, POWER, MIPS (PS2), and worst of all the Sega Saturn. Nobody does the "our console is actually eight different processors in a trenchcoat, segmented across three buses that you have to juggle in real time to keep everything fed" thing anymore like the Sega Saturn or Cell. And no, that's not an exaggeration: the Saturn had eight different processors that all needed to be juggled... two CPUs, a sound controller, a sound processor, two video display processors, a coprocessor dedicated to managing loads off the CD-ROM, and a system controller, all with different capabilities and bus access. Same for Cell with its weird-ass processing element model with a ring and no access to system memory, etc. Those are far, far different from the way x86 chips (even semi-custom APUs with different buses etc.) are designed, and the cognitive load was huge for developers.

Microsoft was ahead of the curve in the sense that the Xbox was a semi-custom Intel processor and an Nvidia GPU, and the Xbox 360 was a semi-custom POWER processor and an ATI GPU, but Sony kept at it far too long. They bet everything on Cell; the original idea was that Cell could also be the GPU on the same chip, but it performed so badly they had to add a commodity GPU at the last minute to try and fix it, which left them with a CPU with a super-weird programming model and completely undocumented, opaque hardware that was a nightmare even to bring up a hello-world application on. Then they went "never again" and moved to a commodity x86 SoC with everything integrated, alongside Microsoft. That brought costs down a ton and fully aligned them with what was happening in the PC space.

The overall trend was clearly from the arcade/Sega Saturn era of highly custom, arcane architectures with lots of individual weird chips towards "CPU+GPU" arrangements and then finally just integrated APUs. And that's also exactly when they stopped selling things at a loss; the move towards integrated, semi-custom commodity architectures was a major part of that. The Xbox One and PS4, along with their refresh and Pro versions, both moved into hardware profitability very quickly.

Also, both of your examples of profitable custom hardware were Nintendo, and they have always been notorious for going really cheap on their hardware. The few times they haven't, they've gotten burned.


This is my understanding as well. A big part of this was the HD-DVD v Bluray war around that time - Sony winning with Bluray would be worth millions of dollars more, so they were willing to take a loss to get a Bluray player in each person's home.


I bought my first Xbox because the Xbox One S was at one time one of the very best 4K Blu-ray players on the market at any price, and also one of the less expensive ones. The fact it could also play games was merely a bonus to me at the time.


When it comes to network subscriptions, at the time of this article PlayStation's online services were pretty much all free. It was one of the differentiators compared to Xbox's paid online services at the time.

That said though, the original PS3 hardware was definitely sold at a negative margin and wouldn't be profitable until four years into the console's life.

Article from 2010 talking about how the PS3 hardware had only just become profitable: https://www.pcworld.com/article/512740/article-4244.html

The console released in 2006.


This is largely not true these days. The Switch sold at a loss at launch, but slowly turned a tiny profit as costs went down. The modern revisions (the Switch Lite and OLED) are also priced similarly, so any profit they're making off the hardware itself is incredibly marginal.


>I wonder if this is due to the fact that the Playstation hardware is (was?) competitively priced to encourage revenue generation through games? Or was Sony simply very good at mass-producing these units?

By 2010? Likely both.


Sony sold the consoles at a loss, and likely had the CPUs for much cheaper owing to having bankrolled the chip itself.


Navy submarines use Xbox controllers to control periscopes.


The USAF uses them for Predator drone flying as well.

I'm not really a pacifist but there is something profoundly unsettling about real-life Call of Duty.


Before this, the M67 grenade was designed to be familiar to soldiers who grew up throwing around baseballs.

I find that incredibly similar to this.


More likely game controllers over the past few decades have evolved to be really optimized for controlling the position and orientation of an entity in 3D space which happens to be the same problem for characters in videogames and many objects in the real world, and their ubiquity makes them both cheap and easy to integrate.


And being designed to withstand children and pro-gamers alike makes them reliable and accurate!


Also used to control drones and robots in other branches of the armed forces.




> “At least you can say that in Pratchett’s books, the bloody elves never sang!”

I think Terry Pratchett is one of my favourite writers. The absurdity of his fantasy settings is just the right level of entertaining for me. Everything flows so smoothly that I sometimes only get the subtle jokes on my second read-through.


There are many, many jokes that are so obscure as to be almost impossible for non-British readers to discover. A trip through the L-Space wiki will point out many (and there are some I’ve noticed that aren’t listed there).

Anytime something is named, it’s probably a joke or reference of some sort.

It’s also quite fun how many of the “impossible” setups or situations are just literal copies of real-life stories.


For a modern and non-British take on this genre I would strongly recommend The Tales of Pell by Kevin Hearne and Delilah S. Dawson, if anyone’s looking. It’s got that same vibe of absurdist humor with real life references in a fantasy setting. The politics are less about class and more about identity.


Hey, thanks for the recommendations! These sound great.


The Last Continent instead makes a lot of Australia jokes. The first one I remember is the brewery built on an ancient sacred site which the aboriginal peoples actively wanted desecrated.


So this is the killer application for the blockchain? Not a store of value, but a store of chess games?

In all seriousness, I'm not sure chess needed an "entertaining crypto layer" or "NFTs tied to pieces"... Best of luck


This reminds me of branching in PlanetScale (https://docs.planetscale.com/concepts/branching). I am not an expert, though. Is this a similar approach in terms of use case?


The branching in PlanetScale is for schemas (your team branches out to develop your database, and merges the branch when the new development has completed).


I really feel like the lack of well-structured citations is what's holding many Wikipedia articles back. It's just not good enough to throw a paper reference (potentially behind a paywall) or a news article after a full paragraph of text.

Often when editing a page, I am left wondering where exactly the previous editor got a fact from. The wiki markup even allows "quotes", i.e. short extracts of exactly what you're referring to, but I've yet to see anyone besides myself use it consistently.
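For illustration, a citation that pins down the exact supporting passage might use the `quote` parameter of the citation templates like this (the author, title, and quoted text here are made up):

```wikitext
<ref>{{cite book
  |last=Doe |first=Jane
  |title=An Example History
  |year=2001
  |page=42
  |quote=The treaty was signed on 3 May.
}}</ref>
```

The `page` and `quote` parameters together let a later editor verify the claim without rereading the whole source.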

Having a structured list of references for a given topic in a Wikipedia article is incredibly helpful!


Great points! In an ideal world, editors would note exactly where in the referenced work they found the information — https://en.wikipedia.org/wiki/Wikipedia:Citing_sources#Ident... has some guidance on how to do that :)


I mean, there is even stuff like linking to pages in PDF files... but I feel like it's another feature that's rarely used.

Maybe, in an ideal world, I could imagine an editing experience with a split screen editor with the article text and the references open at the same time. Right now, it does feel like references are an afterthought in the editing workflow.

