
The author fails to realize that not all programming is web programming or desktop programming. Those are just a subset of programming: software development. It's reasonable to assume that all software development will one day be web programming, since a web app is much more accessible than its desktop equivalent. (I personally hope this is not the case, because with web apps customers pay a recurring fee for the server resources they use rather than a one-time payment for the program itself; web apps are more expensive for users.)

But software development is far from the majority of programming; most programs are written by people in mathematical and scientific fields. Since these aren't distributed, it's easy to forget they even exist. These academic programs are frequently reimplemented by other organizations: if one lab publishes the results of a new study, other labs will want to verify them, and they're not going to use the same code, since that would hardly be a thorough verification.

But I don't think even all software development will become web development. I wasn't around back then, but I'm guessing people said the same thing about command-line programs becoming GUI programs when GUIs came out, and yet I use vim, tmux, git, countless REPLs, and many other command-line programs.



Are they more expensive?

What is the cost in time of supporting a desktop app? What is the cost of losing all your data because you failed to put a backup solution in place? What is the cost of having inaccessible data because it's siloed on a desktop somewhere?

People don't pay for the server resources; they pay for the value added by packaging software together with hardware and providing the support services (sysadmins, devops, etc.) that keep that infrastructure running 24/7. They also pay for the expertise that keeps the company staffed with great engineers instead of mediocre ones. Server infrastructure is simply the most identifiable cost to a developer, but it is not at all what people pay for.

They pay for the peace of mind of knowing that really qualified people are managing their infrastructure and that they don't have to worry about it, so they can focus on more important tasks like running their lives and businesses. If you value your time, SaaS is generally a great deal (especially if you're not extremely knowledgeable about IT/IS).

Same thing with the app store: you're not paying for the infrastructure, you're paying to access a market segment where people open their wallets for shiny things and are willing to pay more for quality.


What I was saying is that the reason payments for web services are monthly is that they are exactly that: services, in addition to products. As a student on a low budget, I feel more secure buying a desktop app at a one-time cost and being personally responsible for my data than paying a recurring fee forever and knowing that if I stop paying, my data could be gone. That's mostly my irrational fear of commitment to just about anything, though. More rationally, the server could go down at any time (an EC2 outage, say) and I could be stuck without my data when I need it. If I know I'm going to need data I have locally, I can store it on my phone and take it with me when taking my computer isn't feasible.

So maybe web apps make you pay more for more. But what if I'm perfectly fine with what I used to have and don't want more? I'm paying more for the same thing if I never plan on using the service anywhere but on machines I can easily keep synced myself.



