Hacker News | rooundio's comments

Reminds me of this Science article in 2017:

http://science.sciencemag.org/content/357/6354/880.2#1504269...

on why scientists need social media influencers …


I am not associated with that company, but the Nomatic (original) backpack completely transformed my travel experience into something hassle-free. I got it years back, right in the aftermath of their Kickstarter campaign.


Switched to https://refind.com instead of bookmarks.


How does this work? The name and the promise is great but I'm skeptical.


Unfortunately true for most scientific communities, and the reason why the current academic system is badly broken and talented young minds decide to quit academia before even submitting a dissertation.


Talented young minds are mostly quitting because of terrible pay, inexperienced/cruel supervisors, long hours, gambling on tenure, and opportunities in private industry.

The groupthink and publishing pressure are also problems, but they're very secondary to hellish career conditions.


The reasons you raise are very true (I have been there), but most of them are mere symptoms of something deeper. I believe the root causes are exactly the kind of social dynamics discussed in the article.

Terrible pay / other opportunities: Even at Assistant Professor level, pay is not so terrible. Sure, you won't get rich, but for most in academia that has never been the goal and would not matter.

Long Hours: That also mostly would not be a problem, as long as you were able to do what you truly like.

Cruel supervisors: Here it's getting interesting. Why do you think that is? My hypothesis is that people become cruel if they don't like what they do. They basically become grumpy old men at the age of 40, because they always have to fight. Fight for money, fight against reviewers, fight internal department politics. Supervisors become cruel because they secretly hate their jobs, or better: the jobs the system makes them do.

Gambling on Tenure: If you have the "right" supervisor in the "right" field (read: a good networker in the current hype topic), your chances are actually not that bad. But as soon as you do something that is currently not on the radar of the community, or something that is even controversial, you get yourself into a fight, which as a young researcher you will lose. In academia you can only make it if you are well connected and the community carries you there. Groupthink again. After years in the system I believe it all boils down to that: social dynamics and groupthink.


Academia was always broken. Despite its current undeserved reputation for being open to debate and new ideas, the opposite has always been true. Academia has its sordid history of dogmatism and hero-worship. And progress in knowledge usually advanced in spite of academia rather than because of it.

Go read about the battles between Newton and Hooke or even Newton and Leibniz.

The doctor who first suggested washing hands to cut down on infections in the 1800s, Ignaz Semmelweis, was ridiculed and mocked by surgeons.

Scientists who brought up germ theory were mocked as quacks by doctors and scientists who rigidly adhered to the miasma theory as gospel.

Of course, white supremacy was accepted "scientific fact" for a very long time in academia, and anyone who thought otherwise would have been viewed much like a flat-earther today. You could argue that white supremacy would still be "scientific fact" were it not for WWII.

My personal favorite is Georg Cantor who was mercilessly attacked by his fellow academics within math and even without for his theories on infinite numbers.

"The objections to Cantor's work were occasionally fierce: Leopold Kronecker's public opposition and personal attacks included describing Cantor as a "scientific charlatan", a "renegade" and a "corrupter of youth".[8] Kronecker objected to Cantor's proofs that the algebraic numbers are countable, and that the transcendental numbers are uncountable, results now included in a standard mathematics curriculum. Writing decades after Cantor's death, Wittgenstein lamented that mathematics is "ridden through and through with the pernicious idioms of set theory", which he dismissed as "utter nonsense" that is "laughable" and "wrong".[9][context needed] Cantor's recurring bouts of depression from 1884 to the end of his life have been blamed on the hostile attitude of many of his contemporaries,[10] though some have explained these episodes as probable manifestations of a bipolar disorder"

https://en.wikipedia.org/wiki/Georg_Cantor

"charlatan", "renegade", "corrupter of youth". Isn't it interesting how academics attack each other like religious people attack each other? The odd thing about some of the attacks on Cantor is that it came after his death and after he had PROVEN his countable and uncountable infinites.

The history of academia is as nasty and vicious as any other institution. And its gatekeepers and heroes as petty and pathetic as any you'd find anywhere else.


Let's not forget Ludwig Boltzmann, who derived the thermodynamic properties of gases assuming the existence of atoms:

' In 1904 at a physics conference in St. Louis most physicists seemed to reject atoms and he was not even invited to the physics section. Rather, he was stuck in a section called "applied mathematics" '

Things did not end well for Boltzmann [1].

And my favourite, John Bell, who was the first to understand the consequences of entanglement, in 1964. These ideas were met with derision by the mainstream for decades (citation needed.)

[1] https://en.wikipedia.org/wiki/Ludwig_Boltzmann


Funny how words' meanings change over time: Plato's original Academy vs. today's academia. Socrates was a "corrupter of youth" too. The word "supremacy", as in "quantum supremacy", is much discussed now, but why would anybody risk their career to complain about the alliance of ACM with Elsevier against Open Access?


> My personal favorite is Georg Cantor who was mercilessly attacked by his fellow academics within math and even without for his theories on infinite numbers.

I do not think this is comparable to other examples. Math is not science in the sense that its correctness is not determined by the outside reality, only by its internal consistency.

Different (finite vs infinite) axiomatizations lead to different classes of mathematical structures, so it is only a matter of custom which structures are worthy of studying and how these 'leading' structures are defined.

And there is a point that if one studies countable structures (e.g. arithmetic or graph theory), then using arguments from infinite set theory (e.g. ZFC) is overkill. We do not know whether such a theory is consistent, and if it is not, then most likely much simpler theories covering the countable structures would still be consistent.


> so it is only a matter of custom which structures are worthy of studying

Or in other words, which proof steps are considered valid.


> Isn't it interesting how academics attack each other like religious people attack each other?

Maybe it is if you don't know that academies started as religious institutions, but knowing the history I would rate it as an amusing anecdote.



bullish on this as well


6) believing your technical opinion matters -- I've seen way too many VP's making technical decisions that they are too far from the work to make, trust your team!

This! It takes managers who understand their role as "facilitators" rather than "decision makers". And they are kind of rare.


My "functional programming epiphany" came in a talk by Martin Odersky, who remarked that imperative programming is like thinking in terms of time, whereas functional programming is like thinking in terms of space. Don't think about the sequence of how to achieve things, but the building blocks needed to do so. That nailed it and made me a Scala convert ever since.


Hickey himself based Clojure on a different interpretation of time (place vs value). He talked about it at a conference long ago.

Moving away from falsely linear time and shared mutation is one of its strengths: you can rely much more strongly on what has already been assumed. Very good for concurrency.


That talk was "The Value of Values," which I enjoyed as well.

Links for anyone interested:

https://www.infoq.com/presentations/Value-Values (1 hour version)

https://www.youtube.com/watch?v=-6BsiVyC1kM (1/2 hour version)


That's a great talk and has had a lot of influence. It's specifically referenced in Project Valhalla's proposal: http://openjdk.java.net/jeps/169


Ha, good catch. That's a strong sign if Oracle/Java is using it as an inspiration.


The epochal time model of Clojure was an enlightenment moment for me when I was learning Clojure. I now sorely miss it when I have to work in any other language :)


The book "Learn you a Haskell for great good!", Which is in my opinion an essential read for someone wanting to get into functional programming, describes it as "imperitive is when you tell the computer a sequence of operations to get a result. FP Is when you tell the computer what thingsare.

I think it should basically be required reading for any programmer, it's a very easy to follow look into the functional paradigm. It's also free to read online! http://learnyouahaskell.com


I own a copy and can't help but disagree. It goes over a few things like list comprehensions, types, etc., but I couldn't even stumble through writing a basic Haskell program when finished. I'm looking forward to Manning's Grokking Functional Programming if it ever comes out. The Haskell Book is also popular these days.


Really? It gave me a great grasp of the language and syntax and I was able to write some basic things immediately after, and anything I didn't know how to do (like communicate over a socket, etc.) I could google search.


Highly recommend http://Haskellbook.com


Why not think both in terms of space and time?



I mean, that's what imperative programming ends up being, to some extent. But the idea is that the fewer things you have to think about, the better.


Clojure core.async!


Best example I heard was in an F# talk. The guy used a bar tending analogy:

FP => I'll have a Sazerac

Imperative => Excuse me sir, could you take some ice, add rye whiskey, add bitters, add absinthe, shake, strain into a glass, and add a lemon garnish before bringing it to me


Well, isn't this nice: outsourcing the knowledge of what makes a Sazerac, and how to make it, to somebody else, and just declaring that you want it?

Would you mind actually making the Sazerac in your FP "analogy" as well?


I like Clojure, but stopped using it due to not having a good way to define DTOs at a service level (I prefer noisy statically typed languages, apparently). Best guess:

  (-> {}
    ice
    (rye :2-fingers)
    bitters
    absinthe
    shake
    strain
    garnish)
Now I think that strain flipped the returned type from drink to a glass with the drink.

All this shows is that OO and FP are duals [1]. I don't claim to get FP perfectly, but my moment of zen was realizing this.

1 - http://wiki.c2.com/?ClosuresAndObjectsAreEquivalent
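The duality that c2 page describes can be sketched in Python (the `Counter` example is my own, purely illustrative):

```python
# An object: state plus behavior bundled in a class.
class Counter:
    def __init__(self):
        self.n = 0

    def increment(self):
        self.n += 1
        return self.n

# A closure: the same state and behavior, captured in a function's
# enclosing environment instead of an instance.
def make_counter():
    n = 0
    def increment():
        nonlocal n
        n += 1
        return n
    return increment

# Both expose identical observable behavior.
c = Counter()
f = make_counter()
assert [c.increment(), c.increment()] == [f(), f()] == [1, 2]
```

In this sense a closure is "an object with one method", and an object is "a bundle of closures over shared state".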


Thank you, I had not seen that c2 page. Many in the JavaScript community have fervent arguments for/against closures/objects (which I do not share). The educated debate in that link is a quality resource on the subject.


forgot the sugar!


Haha...didn't think so many people would respond. There is a Lisp example below. F# is similar, but with pipes and arrows.


Imperative (especially Java-style OO-imperative) programming will just about always win over non-OO declarative programming in an analogy like that (i.e. a simulation of a real-world process), by nature of being focused on step-by-step "world manipulation" (and object interaction, in the case of OO).

The value of FP comes from proper abstraction over these processes in functional terms, at which point they can be implemented trivially (few bugs, few iteration cycles to get right). This can probably be done for every problem space; the question is at which point the abstraction costs outweigh the gains. Considering that FP is becoming more and more mainstream, it's probably more viable than was thought in the past. Still, I imagine system-driven games or complex real-world simulations, with lots of side effects, would lose more from FP than they'd gain.


> Imperative/OO [...] programming will just about always win over non-OO declarative programming in an analogy like that (i.e. a Simulation of a real world process)

Some time ago I thought that too, but I'm no longer convinced. Directly mutating values (OO style) feels more natural at first, but then you run into trouble with side effects, order of execution matters more than it should, and you start keeping snapshots of the whole state just to get a consistent world state during computation; otherwise this whole mess produces a whole class of bugs on its own. These problems drive you more and more in the FP direction, and the FP style definitely has its merits in this regard.

I think the following article articulates this very well:

"A Worst Case for Functional Programming?"

http://prog21.dadgum.com/189.html


FP is more like: drink(sazerac(garnish(strain(shake(absinthe(bitters(whiskey(ice(glass)))))))))

Ultimately it's the same result? The difference is when you can reuse and compose functions.
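That reuse point can be sketched in Python (a hand-rolled `compose` helper; the step names are hypothetical):

```python
from functools import reduce

def compose(*fns):
    """Compose right-to-left: compose(f, g)(x) == f(g(x))."""
    return reduce(lambda f, g: lambda x: f(g(x)), fns)

# Hypothetical steps, each a plain function from drink-state to drink-state.
ice     = lambda d: d + ["ice"]
whiskey = lambda d: d + ["whiskey"]
shake   = lambda d: sorted(d)

# Composition yields a reusable sub-recipe, itself just a function.
make_base = compose(shake, whiskey, ice)
assert make_base([]) == ["ice", "whiskey"]
```

Because `make_base` is a value, it can be passed around and composed further, which is hard to do with a fixed nest of calls.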


Exactly.

In FP you start with functions and just keep composing. With OO you start with classes and objects.


In OOP you compose nicely as well. It is also easier to understand, because it reads like a conversation.

  Me(Drinker)->drink(
    Sazerac(Cocktail)
      ->garnish()
      ->strain()
      ->shake()
      ->absinthe()
      ->bitters()
      ->whiskey()
      ->ice()
      ->glass())


I would probably use threading to achieve comparable readability.

  (def sazerac 
    (-> (mix :ice :whiskey :bitters :absinthe)
        shake
        strain
        garnish))

  (drink :me sazerac)
I've never had a Sazerac. It sounds like a nice drink.


I would create a Cocktail data structure and an instance of Monad for it.

    sazerac = do
        add ice
        add ryeWhisky
        add bitters
        add absinthe
        shake
        strainInto glass
        add lemonGarnish

    main = serve $ makeCocktail sazerac


Haskell really is a pretty nice imperative language sometimes.


That isn't composition to me but sequencing. Function composition yields a function, not the result of applying the functions.

Either you have an object with all of garnish(), strain() and whatnot on it, or each method returns an object that handles the next step in the chain. Neither approach scales at all without modifying existing code.

The real difference is that function composition gives you a reusable function you can further compose while objects keep piling up methods until you're left with god objects or indirection hell.


And people have the cheek to complain about parens in Lisp...


It's ironic, because foo() and (foo) have the same number of characters; but the latter is actually data you can manipulate directly.

Reminds me of the blub paradox in beating the averages[1].

[1]: http://paulgraham.com/avg.html


I once had a comment where I translated a Clojure function into its equivalent syntax in Python. It was still pretty hard to read. I think it's that Lisp's use of function composition for everything makes code hard to parse until you get good at it, even though the standard practice is to hide it with macros and many small functions.

https://news.ycombinator.com/item?id=11174946#11177360


I like more forward-building, pipe style FP. Reading outwards on the statement level still makes things hard for me.


Fluent interfaces from OOP are different from composition because they mutate-and-return the object reference; if the instance is aliased anywhere else in the program, there is spooky action at a distance. Composition is about programming with values, without mutation. As far as syntax goes, it is a trivial macro to translate `x.f().g()` into `g(f(x))` (Clojure actually provides it: http://clojure.org/guides/threading_macros)
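A minimal Python sketch of that translation, with a hypothetical `thread_first` helper standing in for Clojure's `->` (which is a macro, not a function):

```python
# thread_first(x, f, g) reads left-to-right like x.f().g(),
# but every step is a plain function over values -- no mutation, no aliasing.
def thread_first(x, *fns):
    for fn in fns:
        x = fn(x)
    return x

double = lambda n: n * 2
inc    = lambda n: n + 1

assert thread_first(3, double, inc) == inc(double(3)) == 7
```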


> they mutate-and-return....

Not necessarily; you can easily write instance methods which merely copy the existing object.


If you want to create a bar food (a FingerFood object, not a Cocktail) which could also use a garnish() method, which inherits from which? Or should both objects inherit from a parent Garnishable object? It's clear to me that object composition is less flexible than functional composition.


You can have a trait/interface. But since garnish will be doing different things, it is OK to have two implementations. FP looks nice in theoretical examples; in the real world, not so much.

In FP you will end up with garnishFingerFood and garnishCocktail, because you need to encode the specifics of the garnish action somewhere. In OOP you will have garnish methods on Cocktail and FingerFood, and the specifics and related knowledge of how to perform the garnish will live on the object itself.

OOP is a really powerful concept, but it fails in languages that have shit implementations. Java and C++ forsake OOP principles for "performance", or are made by people who do not understand the concepts (Python, PHP).


From the specific example you're presenting, I don't see how you couldn't just have a Garnishable typeclass, that implements garnish differently for the FingerFood or Cocktail types.

The comment that FP isn't nice in the real world is pure baloney. For lots of "real world" IO-bound problems, there is nothing better suited than a functional programming language with powerful abstractions. Things like monads let you write code in an imperative style without losing any of the benefits of the functional paradigm.


OOP doesn't actually have the market cornered on polymorphism. As my sibling replies show, FP languages have long known how to achieve it.


> In FP you will end with garnishFingerFood and garnishCocktail because you need to encode somewhere a specifics of garnish action. In OOP you will have garnish methods on Coctail and FingerFood and specifics and related knowledge how you need to perform garnish will be on object itself.

You're complecting. In the real world of Clojure, you could simply define a protocol and provide different implementations of "garnish".


You just have the method as an aspect that you import into the class ;)


Generic functions: functional yet dynamically dispatched.


Generics aren't dynamically dispatched. Java has type erasure, and C# emits specializations per instantiation.


No, I was talking about (Common) Lisp's generic functions (http://clhs.lisp.se/Body/m_defgen.htm). I realize I did not give enough context. Define a generic function:

    (defgeneric garnish (what with-what))
Specialize it on one or multiple arguments:

    (defmethod garnish ((c cocktail) (f fruit)) ...)
    (defmethod garnish ((s sandwich) (h ham)) ...)
    ...
But you can use it like a function:

    (let ((currified (rcurry #'garnish :curry)))
      (map 'list currified items))
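For comparison, Python has a rough analogue in `functools.singledispatch`, though it dispatches on the first argument only, unlike CL's multiple dispatch (the types here are hypothetical):

```python
from functools import singledispatch

class Cocktail: pass
class Sandwich: pass

@singledispatch
def garnish(what, with_what):
    # Fallback when no specialization is registered for type(what).
    raise NotImplementedError(type(what))

@garnish.register
def _(what: Cocktail, with_what):
    return f"cocktail garnished with {with_what}"

@garnish.register
def _(what: Sandwich, with_what):
    return f"sandwich garnished with {with_what}"

# Dynamically dispatched, yet still an ordinary first-class function.
assert garnish(Cocktail(), "fruit") == "cocktail garnished with fruit"
```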


Ah my bad! :)

These look very much like Clojure's protocols and multi-methods.


Yes, more like multi-methods, except for standard qualifiers (:around, :before, ...) and user-defined ones. OTOH, Clojure allows you to define a custom dispatch function.


Ah yes, Clojure has no first-class support for aspects. It's easy enough to monkey-patch definitions for the rare case when it's needed.

Most of the time however I try to avoid aspects as they introduce hidden behaviour in existing functions, which can be hard to reason about at scale, especially with more than a few developers.


I use them all the time. It's a great way to separate responsibilities.


I prefer function composition for that when possible :)

Aspects I feel are more useful when you want to modify the behaviour of existing code you don't own.

Maybe if I had easy access to them I'd find more use cases :)


The example is just about abstraction.

The imperative equivalent would be:

"Give me a Sazerac!"

(imperative, hehe)


"It is imperative that you give me a Sazerac!"


Some time ago at University, we had to develop a puzzle solver in FP. In the report's introduction, I wrote something like "In this project, we must code imperatively in the functional programming language OCaml". The joke was well received.


First of all, that's not how you make a Sazerac. You rinse the glass with absinthe and toss the absinthe, you don't add it to the mix. There's also an ordering dependence: the rye and bitters can be mixed in either order, but adding ice, shaking, and straining should happen in that order with nothing in between, because the longer the warm ingredients are in contact with the ice, the more you're watering down the drink. I'm not going to go so far as to say the lemon garnish is wrong, but an orange peel rubbed on the rim and then garnished is better IMHO.

Second, that's not functional programming, that's calling a library function.


I think a better analogy would be:

FP => I'll have a Sazerac

Imperative => Serve me a Sazerac

Imperative programming isn't devoid of abstractions; it just has different ones.


tl;dr "Classical" (natural) scientist: "X is only understood once I found first principles that x can be reduced to." "Modern" scientist: "X might be so complicated that there are no first principles that x can be reduced to. Rather, finding a neural network that can do x is the best I can do, and it explains why x can be done."


I think you should delete the "first" in both cases. Chomsky merely wants a principled model. He's not making any pretense that we're anywhere close to understanding language at a fundamental level. Norvig isn't really interested in principled models at all.



Gotham, by Tobias Frere-Jones. A very similar open-source alternative is Montserrat, by Julieta Ulanovsky.

