Hacker News | fretlessjazz's comments

Hey there-- James from Dasheroo here. We're looking into the iOS scrolling issues on our homepage; sorry for the inconvenience!


Not just iOS. Chrome on Windows here, and I get the janky scrolling/CPU fan turning up as well.


FF 38.0.1 on Win7 here. Same problem.


Chrome on Linux was fine.


It's pretty smart, actually. They don't want to ruin their SEO.


Um, I'm sure Google had no idea about Wikipedia's plans and just let its spider autocrawl the domain without manually overriding its settings.

In any case, I'd say it's in Google's best interest to keep Wikipedia pages high up in the results.

Further, Google's spider can "apply" CSS and "see" which divs cover other content on the page.


I run Rails and got tired of seeing 404s for requests to standard ASP or PHP software (such as phpMyAdmin), so I added this to our Apache conf:

RewriteRule \.(asp|aspx|php|jsp)$ - [F,L,NC]

RewriteRule (w00tw00t) - [F,L,NC]

RewriteRule (phpmyadmin) - [F,L,NC]

RewriteRule (php-my-admin) - [F,L,NC]

That cuts off those requests before they hit a Rails process and suck up any additional resources.


On Lighty, I simply have:

    url.redirect = (
        "^(.*)php(.*)$" => "http://www.kernel.org/pub/linux/kernel/v2.6/linux-2.6.37.3.tar.bz2",
        # other stuff
    )
I don't use PHP on the server, so I don't know whether these kits end up downloading the kernel or not.


Please don't do this... having bots launch a DDoS attack on kernel.org is not good.

Just throw the request away or return a 404 at the load balancer level.

If you're on Apache, use mod_security; if you're not, put Varnish in front and configure it to return simple 404 errors for such pages.

But don't mod_rewrite, redirect or otherwise throw traffic onto someone else's server, let alone one that will result in a traffic cost for them.


> Please don't do this... having bots launch a DDoS attack on kernel.org is not good.

Yeah, point them at microsoft.com instead! Should be easy to find a hefty service pack or DirectX install for the bots to hit...


Even though not all of us like Microsoft, you still shouldn't do this. The best way to handle it is to send random data at ~10 bytes/s and slow the bots down.


It'd be interesting to keep a list of the bots, and randomly redirect the traffic back at them. My first thought was that this would mess up people who unknowingly have a bot on their computer, but then I realized this might actually make them look into getting their computer fixed.

Am I missing something here, or is this actually a decent idea?


I suppose you could always redirect to 127.0.0.1. Maybe even go for a port that's likely to be open on a statistically random compromised system, like 135 (Windows DCOM, can't close it to localhost without breaking like half the system).

Edit: I just tried this in IE on my Win box; the connection even stayed open for a good long time! Firefox blocked it, though, which is probably good.


I doubt these bots can handle the redirect. It's JS, and I don't see why anyone would bother to support it. Maybe someone better informed than me can say whether curl or wget follow redirects by default.


JS? 3xx HTTP status codes cause a redirect without any JS whatsoever. (For what it's worth, wget follows redirects by default; curl only does with -L.)

http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html
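As a quick illustration of the point above, here is a self-contained Python sketch (not from the thread; server and paths are made up for the demo) showing that a plain HTTP client with no JavaScript at all follows a 301 purely via the Location header:

```python
import http.server
import threading
import urllib.request

class Redirector(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old":
            # The redirect lives entirely in HTTP headers: no JS involved
            self.send_response(301)
            self.send_header("Location", "/new")
            self.end_headers()
        else:
            payload = b"landed"
            self.send_response(200)
            self.send_header("Content-Length", str(len(payload)))
            self.end_headers()
            self.wfile.write(payload)

    def log_message(self, *args):
        pass  # keep the demo quiet

srv = http.server.HTTPServer(("127.0.0.1", 0), Redirector)
threading.Thread(target=srv.serve_forever, daemon=True).start()
port = srv.server_address[1]

# urllib follows the 301 transparently, like wget or `curl -L` would
body = urllib.request.urlopen(f"http://127.0.0.1:{port}/old").read()
print(body)  # b'landed'
srv.shutdown()
```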


There is an extension to iptables that adds a TARPIT target:

http://xtables-addons.sourceforge.net/


A better option would be to redirect to a third-party service that offers that.


Are there any? It seems like a weekend-sized project, and donations could keep it running.


Is there a JavaScript Folding@home client? A better move would be to make the bots do something useful for humanity.


Yeah, I'm not sure that kernel.org is the right way to go here... plus, I'm pretty sure they won't be happy.

But I do wonder if there's some other way to do the same thing. Perhaps we could set up some kind of tarpit-like server that sends out a file very slowly, say 0.1 KB/s (~1 packet every 10 seconds): just enough to keep their connection alive, but slow enough not to use much bandwidth.

But, I'm not sure if this would be any better than just sending a 404 quickly.
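The drip-feed idea described above can be sketched in a few lines of Python. This is a hypothetical toy, not anyone's production setup: it handles one client at a time, and a real tarpit would spawn a thread or async task per connection and cap how long it keeps each one:

```python
import socket
import time

def tarpit(host="127.0.0.1", port=8080, chunk=1, delay=10.0):
    """Drip a byte or two every few seconds so a bot's connection stays
    open while costing us almost no bandwidth. Blocking, one client at
    a time; a real deployment would use threads or asyncio."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((host, port))
    srv.listen(5)
    while True:
        conn, _addr = srv.accept()
        try:
            # Advertise a huge body so the client keeps waiting for more
            conn.sendall(b"HTTP/1.1 200 OK\r\nContent-Length: 1000000\r\n\r\n")
            while True:
                conn.sendall(b"x" * chunk)
                time.sleep(delay)
        except OSError:
            pass  # the client finally gave up
        finally:
            conn.close()
```

Whether tying up one bot connection is worth holding one of your own sockets open is the same trade-off raised below about just returning a fast 404.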


I'm not sure kernel.org would appreciate that.


Austin startup here as well. I actually moved from San Francisco to Austin to launch http://www.ideaffect.com.

Also, hey Jason! Still owe you a beer sometime.


That's awesome, I was going to go with UserVoice for my feedback widget, but I'm gonna have to support a fellow Texas startup and go with you guys now. I'm based in Dallas myself.


My first customer came from my own personal network as well.

The first customer that did _not_ come from my personal network was a result of posting to app directories such as feedmyapp.com and the like.

The best advice I can give on signing and keeping your first customer is to _make them happy_. Be nice. Crack jokes. When they call or email you, respond immediately. Your first customers are really important because they're vetting your business model in addition to trying your product.

Accept/understand that, as you observe your first customers interacting with your product, you're going to have to make changes. Make them quickly and reasonably.

Every company is unique, but that's how I found and retained my first customers.


I'm no design genius, but when I saw Gap's new logo I think I knew what they were going for, and IMHO it wasn't created haphazardly. The new logo exemplifies the transition of "best practice" design principles from print to electronic media.

They unabashedly violated two rules of logo and print media design, and it's so blatant that I can't believe it was an accident. Their logo features a gradient (print-media epic fail), and two low-contrast overlapping colors, the P and the background square (also a print-media epic fail).

I hope the executives don't have a knee-jerk reaction and demand a logo redesign, but instead play out the campaign and see how it pans out. I'm not convinced it was a mistake.

Perhaps the real redesign wasn't the logo, but their website and online presence?


Many engineers are not designers, and should not be judged as such. Constructive criticism is more helpful here than lambasting them for not adhering to your personal standards of UX.


While I agree with your second statement, your first is misplaced. There's a huge difference between "I'm an engineer, I can't design, here's my resume (even my HTML5 resume)" and using oddly timed fade-ins, slides, text reflections, and so on.

If you try to put that much design in a resume, you get judged as a designer.


Two days late; not sure if you'll still get a chance to read this. But I'll bite.

As someone who's hired (good and bad) engineers in both UI/UX and back-end disciplines, my first impression was, "Yeah, it's not pretty, but he doesn't claim to be a designer." At that point I checked out the source code. It wasn't spectacular, but it did communicate a working grasp of the technology he professed to understand.

If this resume were judged with a bias toward UI/UX, my opinion is that you'd be passing up a potentially hard-working and dedicated employee. With a little help from a designer, this guy could possibly do great things.


I'd argue that engineers should only "design" as much as they feel comfortable with. If you have a decent sense of layout, stick to a nice minimalist grid. If you have a decent idea of which fonts look good together, contrast serif and sans-serif fonts, sizes, weights, etc., and there ya go.

If you're just throwing a gradient here, a border there, a glow there, etc... then you're just shooting blind.


Everyone who tried to convince me that (vim|emacs) is better than (vim|emacs) was wrong


CKEditor and TinyMCE have solved a very difficult problem in extremely elegant ways; their developers deserve better than people wishing their projects dead.


Both of those editors rely on the contentEditable support of the browser to power the actual editing, and browsers have traditionally done a very poor job at producing valid HTML. They make attempts at "post-validation" and "attribute scrubbing", but these are at best makeup on a gorilla. I certainly wouldn't call their solutions elegant.

Having worked on numerous content management systems in the past, one of our biggest issues was trying to "lock down" the WYSIWYG editor so as to minimize the chances that clients could inadvertently break the layout when editing. It was a huge pain point for us, and I lost count of the number of times I had to go in and fix broken pages.

I'm very glad to see that the state of the art in HTML editing is moving forward, and it's well past time for CKEditor and TinyMCE to be put out to pasture, IMHO.


Same problem here: the one feature all these editors lack is the ability to restrict what kinds of formatting users can create.

In a system I've built for a client, some users are allowed to use e.g. bold and italic, but most users aren't (because they would go overboard and make every other word bold). Post-scrubbing the edited HTML often breaks the editing experience, and browsers have lots of bugs in contentEditable.


Hi all, we're trying to address these issues (browser bugs, plus functionality to restrict editing to defined formatting) with Aloha Editor. For example, we have more than 80 unit tests for the ENTER key alone; one of them covers hitting ENTER inside an H1 tag. The current release is in an early development state, and as a first step we want to focus on reliability and basic functionality (being able to write without errors).


A big problem is that contentEditable was never part of an official spec until now (HTML5). My fingers are crossed that we'll see some improvements over the next few years.

