Hacker News — cdev_gl's comments

If you scroll down a bit there's a wireframe of the skeleton, which is what's actually being animated, and you'll notice it lacks bones to define the fingers or possibly even hands. That's why the hands maintain that weird pose throughout all the examples.

My gut says that the quality could be rapidly improved without changing the underlying design at all.

The real issue with this, I think, is that motion capture for humans is already widely available and provides much higher fidelity and control than text. Unless I'm misreading the paper badly, this model was trained on exactly such data. Blending between multiple animations through motion capture is also well-understood.

So while the results are impressive, the practical gains seem very marginal. Perhaps the equivalents of "inpainting" (as mentioned in the text) and "style transfer" would be the big gains here? If we could use this to quickly retarget animations to different body plans (child, adult, space monster), or for smarter interpolation between human-authored keyframes, I could see that being a much-desired tool.


I dunno, as an amateur animator and game developer this would be a huge help to me. I have a first gen Perception Neuron suit, I even wrote an addon for Blender that retargets the Neuron output for the Rigify rig.

But it's cumbersome to put on, take off, and operate, especially when working alone. And while I'm in pretty good shape, there are heaps of movements (eg. martial arts, swordplay, firing/reloading a gun etc) that would probably look silly if I performed them. I can see this being very handy for prototyping animations at the very least.

Replacing finger bone positions is pretty trivial in Blender as well using the Pose Library feature, so the lack of finger data isn't that big of a deal.


This is very informative, I never knew people "did this at home"


The reason there’s likely no finger joints is because a lot of motion capture data doesn’t include fidelity beyond the wrist.

So if they’re training on the standard corpuses of motion capture data available and even mixing in their own, they likely won’t have fingers to base data on.


If you look very closely, the model does have wrists. (Most noticeable on the model's left arm, that is, the viewer's right.)

Either way, the shoulder and elbow joints don’t move much during rope skipping, and no matter how it’s recorded, the motion capture data that was used for training should reflect that.

My best guess is that the model has picked up on the tiny arm motions that are present in rope skipping and wildly exaggerated them for some reason.


The issue with traditional animation today is that all humans end up with the same height and proportions. For example, in The Sims games all adults have exactly the same height even though you can pick different body shapes. If you use mods to change height or leg length, things no longer align, like sitting on a chair. This is why chairs, beds, tables, and doors all have the same height.

Not sure if this diffusion approach is the answer, but we need some smarter way to extrapolate existing animations to bodies and objects that are 10-20% different while still looking natural.


Motion capture is expensive and laborious.


That wireframe is gonna be sore. Lifting with its back instead of its legs...


The film Arrival (2016) is well known for hiring consultants (experts in linguistics, other scientists, even Stephen Wolfram) to improve its portrayal of an alien language and the people working to decipher it.

Nobody complains when a consultant is hired in this capacity, and yet sensitivity readers, whose role is broadly similar, are somehow considered bad? It's not like they rewrite your whole book; they offer notes on things that don't make sense given the characters and plot, just as a developmental editor might.

A publisher doesn't just print the text they're given, they edit it, suggesting changes they believe will help it be better (and thus sell better). There's a back and forth dialog between editors and authors, and it's entirely possible either side might walk away if there's no willingness to compromise. This happens to all books that aren't self-published. Adding another type of editor to the mix isn't gatekeeping.


The article already compared those consultants with sensitivity readers.


I'd wonder if that's an artifact of the source data, drilling down in the possibility space to be more like some subset that duplicates the image label- for example pulling tweets with body text and alt text.

Alternatively I guess it could just pull harder towards the prompt, idk.


When I buy a physical book, I can lend that book to a friend. I own the book.

When a library buys a physical book, they can loan that copy out to patrons. They own the physical book.

For over ten years now, libraries have taken a single physical book, scanned it, removed the book from circulation, and lent out a single ebook. This has been a common practice founded on an interpretation of fair use laws, which allow format shifting (recording a VHS to DVD, for example) and lending, and also copyright laws which explicitly mention the legality of digitizing books for accessibility reasons, to serve the visually impaired.

This "one physical book in storage equals one ebook" lending is the "controlled digital lending" the archive does, and is the issue at stake in this suit. Publishers argue that libraries should not be allowed to lend any ebook except those which they set a specific lending limit to, charging the library a fee every X number of checkouts or every X number of months.

This is obviously not how lending physical books works- the library can buy a hardcover and loan it out to patrons, repairing the spine and binding over and over, eventually moving it to a special collection for viewing on site when it's hundreds of years old and can't be handled.

As a separate, and legally shakier, issue, the internet archive created the "national emergency library" during the pandemic. The logic here was that nearly every library in the country was closed, and within those libraries were physical copies of these books that weren't being lent out, so the same concept as controlled digital lending could apply. This move by the archive was signed onto and supported by thousands of libraries around the country, who agreed that their uncirculated physical copies could stand in for the extra copies checked out of the archive's emergency library.

It's important to note the archive does not loan any book published in the last five years, period. This is a self imposed limitation that your neighborhood library does not work under, but it means that the "lost sales" publishers allege were not of new books. Nobody ever got screwed out of bestseller status because too many library patrons read their book, and that goes doubly so in this case.

All that said, the issue isn't whether the internet archive messed up with its national emergency library. It's whether the whole concept of loaning a book is limited to printed copies, and thus ebooks must be repurchased by libraries ad nauseam, or whether libraries can keep a physical copy in a storage room and loan out a digital copy one at a time, exactly as they do now.

Bottom line, our libraries (and especially their ebook collections) will be much poorer if this decision goes to publishers.


> Publishers argue that libraries should not be allowed to lend any ebook except those which they set a specific lending limit to, charging the library a fee every X number of checkouts or every X number of months.

so this is yet another attempt at digital feudalism, where you bought something, but must pay the supplier a subscription or a cut forever?


This is similar to the process I've used to turn my phone into a dev/writing environment for months of thru-hiking in areas without internet.

I use termux to host a jupyterlab instance and bring along a tiny folding bluetooth keyboard. Even in airplane mode I can connect to the locally hosted jupyterlab instance. Notebooks are kept in git and synced up to my private repo when I'm back on grid.

There are simpler solutions for writing, but this allows me to keep a single workflow across home/travel contexts. Being able to run graphs, basic code and computation as needed is also a plus.
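For anyone wanting to replicate this, a minimal sketch of the Termux side. The repo URL is a placeholder, and on some devices `pip install jupyterlab` may need extra build packages, so treat this as a starting point rather than a recipe:

```shell
# Install Python and git inside Termux, then JupyterLab
pkg update && pkg install -y python git
pip install jupyterlab

# Notebooks live in a git repo; push/pull when back on grid
git clone https://github.com/<you>/notebooks ~/notebooks

# Bind to loopback so it works even in airplane mode,
# then open http://127.0.0.1:8888 in the phone's browser
jupyter lab --ip=127.0.0.1 --port=8888 --no-browser
```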


My older mac book recently died and instead of buying a new laptop I'm using termux to connect to a droplet for side projects. I've tried two different bluetooth keyboards so far and found one I liked with back-lit keys. Also recently setup wireguard on a raspberry pi at home and had fun working on my droplet through my VPN while on a flight. What a time to be alive.
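For reference, the Pi side of a setup like that boils down to a small WireGuard config. A hedged sketch, where the addresses, port, and keys are all placeholders:

```ini
# /etc/wireguard/wg0.conf on the Raspberry Pi
[Interface]
Address = 10.0.0.1/24
ListenPort = 51820
PrivateKey = <pi-private-key>

[Peer]
# the phone
PublicKey = <phone-public-key>
AllowedIPs = 10.0.0.2/32
```

Bring it up with `wg-quick up wg0` and forward UDP 51820 on the home router to the Pi; the phone gets a mirror-image config with the Pi as its peer.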


You could probably set up Tailscale SSH, which was recently posted here on HN [1]. Though why do you need to go from VPN to droplet?

[1] https://news.ycombinator.com/item?id=31837115


Cool, I'll check it out. It is an extra hop, but my F-Droid Termux install has been blown away (along with my SSH key) several times by the Play Store version. So now I save my key on my Pi instead. I did turn off Play Store updates for Termux, so hopefully it doesn't overwrite Termux again.


Termux hasn't been updated on the Play Store since 2020 or 2021. In fact, your repositories are probably outdated.

If you ever need to upgrade, you have to install from somewhere else, like F-Droid.


Thanks, I think actually it was something else that made me re-install from F-droid, but I'm glad I can rule out the play store.


I don't know which is braver: hiking for months, or developing on a Bluetooth keyboard and a phone.


I don't know about brave, but the funniest thing I've done with my phone and keyboard is access an Amazon Workspace to run Adobe InDesign for magazine layout while in my tent: sitting in a sleeping bag, a cold rain pattering against the "roof", with my back resting against my backpack and my phone angled nicely in my lap, using the keyboard's attached touchpad to carefully move and assign articles, tweak colors, and do the final bits needed for printing.

There was definitely some squinting at the screen involved, heh.


I’d love to know more about that. Isn’t running Adobe CC on AWS prohibitively expensive? You need a decent video card and at the very least 8GB of RAM.

Couldn't you buy a laptop with what they'd charge you for a day of work? A quick simulation here is showing me US$ 140/h for a 4-core, 16GB RAM machine, no mention of a video card.


You can easily stow a portable solar panel in a hiking bag that can charge your phone and the other accessories. A laptop would have a harder time getting charged, even if it was a Chromebook. Portable battery packs intended for phones would also be easy to stow. Not only do they fit more compactly, but they are light on your weight budget and not as fragile as lightweight laptops.


Sure, but I’m talking about price. AWS Workspace seems crazy expensive, to the point of being unusable almost.

I love the digital nomad lifestyle, but at that price it’s more of a gimmick than a real choice


A g3s.xlarge running Windows is just under a dollar an hour - 4 cores, 30GB ram, 8GB video ram on Nvidia GPUs.

$140 gets you a t3.xlarge 4 core 16gb ram instance (on linux) for a whole month.


But that’s not Amazon Workspace he mentioned. That’s just EC2.

Check out the hourly price of Workspace:

https://aws.amazon.com/workspaces/pricing/?pg=workspaces&sec...


Sorry this comes a bit late, but the answer is you don't really need a performance machine for doing layout. Since Adobe's apps were originally built 30 years ago, the minimum requirements remain quite low. Moving images around a page can handle quite a bit of lag. You don't even need a rented machine with a quality GPU.

I pay for this workspace: $9.75/month (hourly billing), Windows licensed, Performance tier: 2 vCPU, 8GB memory, 80GB root, 50GB user storage.

Since I'm only doing layout on an as-needed basis while hiking, rather than doing it full time, I tend to spend around $45 or less a month.


“Death and rebirth, trying and overcoming—we want that cycle to be enjoyable. In life, death is a horrible thing. In play, it can be something else.”


I rent a small Linode to run a Minecraft server for my extended family. Sometimes it struggles to keep up with just a half dozen of us, and when I tried swapping from a Spigot-based system to a Forge one so the kids could have mods, it was basically unplayable. I can't imagine scaling out to thousands of users; this was some amazing work.

While developing this did you come up with any best practice recommendations for individual nodes?


Your single-core performance is the most important thing for a Minecraft server, especially for Forge mods.

For vanilla, I’d try Paper (a high performance spigot fork) and see if you still have problems. If you’re lagging with 6 people while running Paper, you simply need a better CPU.


Tangentially related: I had to move a web app off of DigitalOcean because the CPU performance was terrible compared to what you get for $80/mo from a dedicated server from OVH. I know Linode and DigitalOcean compete. Get an OVH dedicated server, run a CPU PassMark bench or whatever, and compare. AWS, Linode, and DigitalOcean are all VPSes and pretty slow.


It might be too pricey, but if you go up to about $25/month, you can get a dedicated server with an i7 from the Hetzner server auction. These cheap boxes don't have ECC RAM etc., but should work well for things like this.

https://www.hetzner.com/sb?country=us


On the page, one option advertised is for €23.53/month with an Intel Core i7-4770, 2 TB HDD, and 32 GB RAM. What’s the catch? How is it possible to rent such hardware for so cheap?


There is actually a catch (besides very little customer service, which is par for the course with cloud stuff anyway), but it's not really their fault. Hetzner doesn't peer with a lot of bigger players, and therefore its datacenters can have really lousy / inconsistent bandwidth and latency for people outside a relatively small area in Europe (and can occasionally have bizarre outages that are presumably related to partitions). If that's not a dealbreaker for you, then they're one of the best deals out there if you don't want to manage your own hardware and don't need really efficient access to other cloud services like S3 (which obviously work much better within Amazon's datacenters).


No catch. Cloud services are extremely expensive when it comes to computing. If you need raw power, rent a dedicated server.

Edit: For a little more a month, you can get a Ryzen 3600, 64GB of RAM and much faster NVMe SSD: https://www.hetzner.com/dedicated-rootserver/ax41-nvme


Ehh you can get cheaper compute in Fremont if you're willing to rent >=1 rack. I'm guessing you only think €23.53/mo is cheap because you've been getting reamed by AWS pricing :P


Can I have a link please? I'm a sucker for cheap dedicated servers :).


I use he.net/colocation

To be clear, I'm talking about racking your own servers, not cloud "dedicated servers".


I'm not sure what you imply with the quotes around "dedicated servers"; this is hardware for rent, not VMs (Hetzner has VMs too, starting from about $5/month, but that's something else).

You do need to go a bit up in price for proper server hardware, although there are a few aging Xeon boxes with ECC RAM on the low end now.


They are second-hand servers. If you want brand new servers you pay a setup fee, and more per month. Once you upgrade or cancel, that (once new) server goes into the auction at a lower price.


Those are ancient servers, long since paid for; you're paying for 1.50 worth of bandwidth, basically some electricity, and a little bit of "support".


If you go with Oracle Cloud, you can get a 4-core, 24GB RAM aarch64 server for free. I've been using it for Minecraft and it performs excellently, especially combined with PaperMC.


Note: it looks like Oracle Cloud has a nice free tier, but that's not a dedicated Arm server, it's a VM (still, with 4 cores and 24 GB):

https://www.oracle.com/cloud/free/?source=:ow:o:p:nav:0916BC...

> Infrastructure

> 2 AMD based Compute VMs with 1/8 OCPU* and 1 GB memory each.

> 4 Arm-based Ampere A1 cores and 24 GB of memory usable as one VM or up to 4 VMs.

> 2 Block Volumes Storage, 200 GB total.

> 10 GB Object Storage.

> 10 GB Archive Storage.

Just to compare with Hetzner: you would typically be able to get 16 or 32GB RAM, an i7 with 4 cores, and 2x1TB disks (no SSD). I'm guessing single core performance might be higher than the Arm offering, and a better fit for Minecraft.

Edit: although maybe core count would win out with PaperMC?


The cheapest available right now was 140 euro. When do the 25 dollar servers come out?


They are in a separate category called Serverbörse [1]. The cheapest currently available is an i7-3770 with 16GB RAM and 2x3TB spinning rust for 28€.

[1] https://www.hetzner.com/sb


If the cheapest is $25/month, I think it would just make sense to get a used school/office PC. I was able to pick up a used Haswell i5 with 8GB of memory for $200, which would probably get better performance, breaking even at around the 8 month mark. That being said, pandemic prices might shift those numbers.


Nothing wrong with getting a used PC, but do you have a 1Gb uplink at home?

Among the cheapest offers right now, there's an Intel Core i7-4770 with 2x2TB enterprise HDDs (spinning rust, not SSD) and 32GB RAM. And a Xeon box with ECC RAM at a similarly low price.


Also, even if you do have a 1 Gbps uplink, a lot of providers will throttle you if they suspect you're actually hosting a server that tries to utilize most of that bandwidth in anything but bursts.


For example, 1TB monthly works out to only about 3Mbps sustained, a common quota for hosted servers.


I can see a few options for 23.53 EUR right there.

Either way, it's an auction page; it may not always have the deals you're looking for in this price range.


When someone cancels their old server and they recycle it into the auction system


Assuming the Minecraft server runs on Arm64, you may want to try out Oracle Cloud free tier.

You can spin up a 4 core 24GB instance, the CPU is 4 dedicated cores and quite fast.


It does run on arm64, I am currently running a forge server with a few mods on this exact setup. It's just for a few friends, I don't think we've had more than 5 or 6 on at a time but it seems to be performant enough.

It's not perfect but it ended up being a better experience than we were having with minecraft realms.


What were the specs of the server and which Spigot fork did you run?

A month ago I ran a Paper instance from a Digital Ocean instance with about 4 GB of RAM and OpenJ9 JVM and it never dipped below 20 TPS even with a larger render distance. This was on Vanilla 1.17.


Have you noticed much difference between a HotSpot-driven JRE and OpenJ9?

I am somewhat irrationally biased against J9 because they made us stick it in everything at IBM, but I'm willing to reconsider for better Minecraft performance.


I have yet to see hard data on it, but the folks that have been in the trenches doing client-side mods on Minecraft swear that J9 is the superior JVM for the latest versions of Minecraft (1.16+). I don't know that anyone has really done a proper benchmark.


> a small Linode [...] it struggles to keep up with just a half dozen of us

I run my server on a 32GB Ryzen 5 PC with the world on an SSD and it often struggles with just 2 players no matter what options I tweak. I keep dreaming of the day the Java parts get the same performance as the Bedrock parts (but I know it'll likely never happen.)


Something weird about a vanilla Minecraft server is that sometimes less RAM is better. If you're using a good enough CPU, going from 4GB allocated to 2GB allocated massively increases performance for low numbers of users, because garbage collection runs more efficiently. You can tune the garbage collection manually, but, super counterintuitively, this fixed most of my problems.
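For anyone who wants to try the manual route, a hedged sketch of a launch command: the jar name is a placeholder, heap sizes are illustrative, and these are standard HotSpot flags rather than a tuned preset.

```shell
# Fixed, modest heap: -Xms == -Xmx avoids resize pauses,
# and a smaller heap keeps individual G1 collections short.
java -Xms2G -Xmx2G \
     -XX:+UseG1GC \
     -XX:MaxGCPauseMillis=50 \
     -jar server.jar nogui
```

`MaxGCPauseMillis` is only a target, not a guarantee, but G1 will size its young generation to try to stay under it.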


Just like in programming, different spoken languages have different capabilities-- certain ideas are easier to express in one language versus another. Some concepts don't even have words across languages. And there's beauty and rhythm, the way words rhyme, so even if you can express the same concepts, the network of how those concepts play off each other is different.

Having shared languages is awesome, allowing people to communicate together. But having only one language would be a great tragedy. It'd be like saying all paintings can only use one color.


I don't think the painting analogy applies. You can add words and meanings to languages. If we all spoke the same language, we'd iterate on it until it could express all the ideas that we have. We may lose some rhyming along the way, but it seems a small price to pay for universal, clear communication.


Eventually new languages would be created. Why? Because everyone wouldn't iterate at the speed or even the same direction.


Right, but if we standardize the language, we can keep track of and de-duplicate all the iterations, much like we do with web standards.


I made this same tech-stack decision while working for a company in the manufacturing industry. React, GraphQL, Node.js. There was a reason behind it.

The head of the company was pushing to modernize our process flow via software meant to drive manufacturing. We had contractors on site from three different companies who were each using us as a test-bed for their software/manufacturing integration. Every week or two they'd plot out some new data they'd need to move from sales to engineering, manufacturing, QC, etc.

In our case, having GraphQL as a translation layer for the sales website saved everyone involved time. However, I can also see many scenarios where that wouldn't have been the case.

It definitely comes down to using the right tool for the job. Knowing how to identify which tools fit and which don't is one of the most important skills to help new devs develop.


I'm always impressed by the way a simple turn of phrase can hide disproportionate impact on human lives.

Specifically interesting in this context is that the turn of the 20th century, and the 1920s in particular, is really the birthplace of modern immigration policy.

Of note is the fact that the requirement of visas for entry and quotas on point of origin appeared around this time. If you read newspaper op-eds for the Immigration Act of 1924 (or the earlier Chinese Exclusion Act), these measures are specifically pointed out as a way to maintain the racial makeup of the country.

Prior to that point, visitors could just turn up at Ellis Island for entry. Afterwards, the island itself served more as an offshore detention center for what were then classified as radicals, anarchists, and those "likely to become a public charge", awaiting deportation.

Citizenship, too, could once be had for as little as the price of a dozen eggs-- check out this booklet published 1921: https://archive.org/details/howtotakeoutyour00metrrich/page/...

"All it costs is $1, not a penny more. No witnesses, no examination, no pull required." -- 2 years of residence later, another set of papers (also on Archive.org) and a short literacy test (added in 1917) and you could be a citizen-- but this rapidly changed as additional quotas and restrictions were added.

The history of immigration law is a fascinating topic, and there's a lot of interesting primary sources available.


Imagine that "booklet" as the equivalent of the ads you see when you browse the internet, like "You are the XXX visitor and won this."

Family story:

My great-grandfather was from the Trento region in northern Italy, very poor. At that time, there were some organizations working with migrants. (take note)

So he set out to migrate, fleeing famine and poverty.

At this organization, they asked his schooling level, and it was about 3rd grade. Not good enough for the US, so he was sent to Brazil. (note 2) He paid his fees, jumped on a shit boat, and three months later he was in Brazil.

On arriving, the support he had paid for was non-existent; all the good land was already taken by the Germans, so the only land left was in the rocky mountains, where the rocks made the fields very hard to work (note 3).

What I mean is: what looks easy today was just as hard back then.

And please, people long ago were not better than people today. I actually think they were worse.

Note 1: These organizations at that time earned a lot of money with "fees."

Note 2: People from the south with more relaxed morals usually lied about this and were sent to the US.

Note 3: It was later known for good wine and tourist places. (grapes love rocky soil and can grow there).


That booklet is fascinating reading. It's a little bit wild to see these on opposite sides of the same sheet of paper:

> The true American believes in Liberty, Equality, Justice, Humanity. [...] The true American believes that "All men are created free and equal." [...] The true American believes in his own ability, but holds that the other fellow is as good as he and should have the same chance to life and happiness. He believes in equality of opportunity.

> Persons not belonging to the White or Black Race, Anarchists, Polygamists, Criminals, Insane, cannot become Citizens; they will not get First Papers.


And also that Africans were considered more desirable immigrants than Asians!


Some early immigration cases involved people trying to prove they were black to avoid removal.


A literacy test in the early 20th century would already exclude a lot of poor people by definition. Not sure why you make it sound like it was a formality.


> A literacy test in early 20th century would already exclude a lot of poor people by definition.

That's actually an empirical statement, it wouldn't be by definition. (Unless by poor you mean "can't read", then it would be by definition.)


I've observed that "by definition" is on its way to becoming one of those emphasis phrases, like "literally", that no longer literally mean what they used to.


Yes. Though even in its literal meaning, 'by definition' is a weaker statement than 'by evidence'. At least in any argument outside of mathematics.

Most of the time the 'by definition' argument just gives you a No True Scotsman.

Take the classic: all men are mortal (by definition in Ancient Greek), Socrates is a man, thus Socrates is mortal.

As an empirical argument, this would make sense. But the Greeks tried to reach 100% certainty. Mortality was included in their definition of a human (otherwise you'd be at least a demigod or so). Of course, that just shifts the uncertainty to the second part: we cannot be sure that Socrates is indeed human by that definition, until he actually keels over.


The literacy test was to gain citizenship after you had already been granted permanent residency, so while they wouldn't be eligible to vote in federal elections, illiterate people would still enjoy almost all the benefits and rights of any other US person. Almost anyone can learn to read; it's not that hard, almost every 5 year old learns to do it. If an adult in the early 1900s wanted to be a citizen but was blocked by a basic literacy test and still didn't learn to read, it's because they weren't interested in putting in the effort (there are people with severe disabilities, but that is a totally different situation and I'm not addressing it). Of course it's way easier to be born rich, but if you think someone who is poor but motivated can't learn to read, you are insulting their intelligence.


The literacy test was a barrier to entering the country, not to citizenship. https://history.state.gov/milestones/1921-1936/immigration-a...

And while that one didn't have a very large effect, the one that did block people from voting managed to filter out black people very well:

https://metro.co.uk/2017/09/20/could-you-pass-this-test-give...


To say that "anyone can learn to read, it's not that hard" because a 5-year-old can do it is disingenuous. It's been well established that children learn language and reading skills exponentially easier than adults due to the "flexibility" of children's minds that is lost with age. It can be nearly impossible for some adults, especially ones with no reading experience when they were young, to ever learn to read.

