> SQL Server has to convert every single value in the column to nvarchar before it can compare.
This of course is not true. It is a defect in Microsoft’s query planner. And the proof lies in the remedy.
The recommended solution is to convert the search argument type to match that of the index. The user is forced to discover the problem and adjust manually. SQL Server could just as well have done that automatically.
No information is lost converting nvarchar to varchar if the index is varchar. If the search argument contains a character that varchar cannot represent, no converted value could match it anyway (unless the index data is UTF-8, which the server should know).
This is a longstanding bug in SQL Server, and not the only one. Instead of patting ourselves on the back for avoiding what SQL Server “has to do”, we should be insisting it not do it anymore.
I’m not sure why the top-rated reply begins by presuming anything about the problem domain. Many domains have a specified language and implied if not explicit collation. Rejecting characters outside that domain is part of the job. There are no emojis listed on the NASDAQ.
The program you used to leave your comment, and the libraries it used, were loaded into memory via mmap(2) prior to execution. To use protobuf or whatever, you are already using mmap.
The only reason mmap isn’t more generally useful is the dearth of general-use binary on-disk formats such as ELF. We could build more memory-mapped applications if we had better library support for them. But we don’t, which I suppose was the point of TFA.
Entire libraries are a weird sort of exception. They fundamentally target a specific architecture, and all the nonportable or version-dependent data structures are self-describing in the sense that the code that accesses them is shipped along with the data.
And if you load library A that references library B’s data and you change B’s data format but forget to update A, you crash horribly. Similarly, if you modify a shared library while it’s in use (your OS and/or your linker may try to avoid this), you can easily crash any process that has it mapped.
Not really. The entire point of the article is that there are a lot of problem domains where data stays on a single machine, or at least a single type of machine.
Both of which are beside the point of viability. It's just a usable system that gives you an idea of how an OS works when it's Lisp all the way down. It didn't invent this idea; it's just a modern example of it.
If only there were one good library. libxml2 is the leading one, and it has been beleaguered by problems internal and external. It has suffered ABI instability and been besieged by CVE reports.
I agree it shouldn’t be hard. On the evidence, though, it is. I suspect the root problem is lack of tools. Lex and yacc tools for Unicode are relatively scarce. At least that’s what’s set me back from rolling my own.
iRobot’s largest creditor isn’t its Chinese supplier. It’s the US government, in the form of unpaid tariffs, some $3.5 million. Arguably it was Trump’s stupid tariffs that drove the company out of business. Rather than bringing manufacturing to the US, they allowed the Chinese to acquire an American company, leaving production right where it is.
There is no pass-by-value overhead. There are only implementation decisions.
Pass by value describes the semantics of a function call, not implementation. Passing a const reference in C++ is pass-by-value. If the user opts to pass "a copy" instead, nothing requires the compiler to actually copy the data. The compiler is required only to supply the actual parameter as if it was copied.
This might be true in the abstract but it's not true of actual compilers dealing with real world calling conventions. Absent inlining or whole program optimization, calling conventions across translation units don't leave much room for flexibility.
The semantics of pass by const reference are also not exactly the same as pass by value in C++. The compiler can't in general assume a const reference doesn't alias other arguments or global variables and so has to be more conservative with certain optimizations than with pass by value.
Unfortunately "the compiler is required to supply the actual parameter as if it was copied" is leaky with respect to the ABI and linker. In C and C++ you cannot fully abstract it.
Note well: the claims about TCP come with some evidence, in the form of a graph. The claims for QUIC do not.
Many of the claims are dubious. TCP has "no notion of multiple streams"? What are two sockets, then? What is poll(2)? The onus is on QUIC to explain why it’s better for the application to multiplex the socket than for the kernel to multiplex the device. AFAICT that question is assumed away in a deluge of words.
If the author thinks it’s the "end of TCP sockets", show us the research: published papers and meticulous detail. Then tell me again why I should eschew the services of TCP and absorb its complexity into my application.
Even the TCP graph is dubious. Cubic sitting systematically above the link capacity makes me chuckle. Yes, bufferbloat can have Cubic "hug" a somewhat higher limit, but it still needs to start under the link capacity.
There was never any danger of public education, so eliminating that danger was quite easy. What we are undermining, though, is the benefit of public education. Witness the last election, where tens of millions were indifferent to democratic governance if it meant cheap gasoline and eggs.
And, yes, the assault on democracy is real. On January 20, Trump signed an order in support of free speech. Within a week he barred the AP over the Gulf of America. Within a month he illegally disbanded USAID. Within 3 months he began suing law firms and defunding university research. Today colleges are receiving letters demanding curriculum in exchange for funding. And we have four years more, at least, to endure.
For those who have changed the world to what it is today, and want that but more, I'm sure public schools have always been something to celebrate. I am not one of those people, however.
>though, is the benefit of public education.
That benefit, even when I treat it with the most generous interpretation, is gone and has been for a while. People whose children attend public school do not benefit from this; their children are being shut out of the economy in favor of bringing in workers from other countries. The political apparatus benefits, if those children are indoctrinated to vote correctly, even as they grow up fit only for a 20-hour part-time job at Starbucks. Even now, you're worried about the politics in this very comment; you don't really care that those children won't grow up to earn a viable livelihood.
>Within a month he illegally disbanded USAID.
Oh noes! I too wish that the United States would spend millions and billions on foreigners in foreign countries. Won't anyone think of the CIA soft power we're losing?