Wow, the rare double penalty. Loss of 5 yards for stooping to a Christmas song for worst song of all time debate, loss of an additional 20 yards for going with a post-Beatles McCartney track. This may be internet bullshit, but by god we have rules.
"Do They Know It's Christmas" usually makes the lists of terrible Christmas music as well.
So much Christmas music is repetitive, and coloring outside the lines is a tough thing to do in this genre. The Pogues nailed it with "Fairytale of New York", but it largely appeals only to Gen X and younger.
I will literally leave a store if this is playing.
But also, the fact that you are damn near guaranteed to run into this song _yearly_ qualifies it to be way higher up than any other song on a "worst song" list (even if it weren't already at #1).
Aaagh you monster it's only November and now that horrible aimless effected wobbly chord fever dream is stuck in my head again. I had at least another couple weeks before hearing it for the first time this year.
I never understood how something could be so bland and yet so revolting at the same time, like a smoothie made of wallpaper paste and dog shit.
You've got to give credit to the genius of McCartney's musicianship: that song perfectly encapsulates the feeling of being stuck in a nightmare you can't escape from.
As a musician, I sometimes make music that I don't like, but just have to get out of my head, feeling compelled to finish a track just to put that pesky musical idea to rest. Recording a track is akin to closing the lid on the coffin; there's nothing more to do, and you can move on.
I wonder if Paul had that experience with that song. The nightmarish chord line is catchy. So he got it out of his head by turning it into a track...
...and ended up getting it in our heads.
I wonder if he thinks "You complain? You can just turn it off. I had to live with that song!". We should be grateful for only being exposed to it once a year, along with the seasonal flu.
How does he sing out of tune on his own darn song? If he's not out of tune, those awfully strained high notes shouldn't have been written in whatever weird key they're in.
Yeah, I mean it's pretty silly to declare something "the worst song of all time" - everyone has their personal taste and their own personal songs they love to hate. For me, it's (of course) Wham's Last Christmas and I Will Always Love You (sorry Whitney, but back in the 90s when I was watching a lot of MTV, listening to a lot of radio, and ads for Bodyguard were all over the place, I seem to have developed an allergy to this song).
And BTW, I wouldn't even call this the quintessential 80s song, for me that's Stevie Wonder's I Just Called to Say I Love You, with that cheapo synthesizer bleeping along in the background. Oh well, to quote Calvin Harris, it was "acceptable in the 80s"...
That's actually tied with Lennon's "Happy Xmas (War is Over)". Listen carefully and you can hear Yoko Ono screeching incredibly far out of tune as the piece climaxes. It's not in the forefront of the song, but it's there, and it's stuck out so much to me ever since I first noticed it.
If this song isn't proof that Paul McCartney is a psychopath, I don't know what is. It is absolutely the most insipid, uninspired piece of garbage ever to be recorded.
Honestly, I find its wackiness quite refreshing. I will always take it over "All I want for Christmas", "Driving Home For Christmas" or "Last Christmas".
Herein lies the tragedy. Google could've offered, even sold, its internal development experience (code hosting, indexing and searching, code reviews, build farms, etc...) which is and was amazing, but it decided that it wasn't worth doing and let GitHub eat its lunch.
Developer infrastructure at Google reported into Cloud from 2013 to 2019, and we (I was there) tried to do exactly that: building products for GCP customers based on our experience building internal developer tools. It was largely a disaster. The one product I was involved with (git hosting and code review) had to build an MVP product to attract entry-level GCP customers, while also keeping our service running for large existing internal customers, who were serving billion+ users and continuously growing their load. When Thomas Kurian took over GCP, he put all the dev products on ice and moved the internal tooling group out of Cloud.
I experimented with something similar in the past to mimic multidimensional arrays, and depending on the implementation this can absolutely _kill_ performance. IIRC, Dash does a linear lookup of variable names, so when you create tons of variables each lookup starts taking longer and longer.
We haven't found this to be an issue for Pnut. One of the metrics we use for performance is how long it takes to bootstrap Pnut; dash takes around a minute, which is about the same as bash. This is with Pnut allocating around 150KB of memory when compiling itself, showing that Dash can still be useful even when hundreds of KBs are allocated.
One thing we did notice is that subshells can be a bottleneck when the environment is large, and so we avoided subshells as much as possible in the runtime library. Did you observe the same in your testing?
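For illustration, the pattern looks something like this (a minimal sketch; the function and variable names are made up, not Pnut's actual runtime code). Instead of capturing a helper's output with $(...), which forks a subshell (and with a large environment, that fork is exactly what gets expensive), the helper writes its result into a global variable:

add_with_subshell() { echo $(( $1 + $2 )); }
sum=$(add_with_subshell 1 2)    # forks a subshell on every call

add_no_subshell() { _res=$(( $1 + $2 )); }
add_no_subshell 1 2             # no fork; the result comes back in _res
sum=$_res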
> We haven't found this to be an issue for Pnut. One of the metrics we use for performance is how long it takes to bootstrap Pnut; dash takes around a minute, which is about the same as bash. This is with Pnut allocating around 150KB of memory when compiling itself, showing that Dash can still be useful even when hundreds of KBs are allocated.
Interesting. When you say "even when hundreds of KBs are allocated", do you mean this is allocating variables with large values, or tons of small variables? My case was the latter, and with that I saw a noticeable slowdown on Dash.
Simplest repro case:
$ cat many_vars_bench.sh
#!/bin/sh
_side=500
i=0
while [ "${i}" -lt "${_side}" ]; do
j=0
while [ "${j}" -lt "${_side}" ]; do
eval "matrix_${i}_${j}=$((i+j))" || exit 1
: $(( j+=1 ))
done
i=$((i+1))
done
$ time bash many_vars_bench.sh
5.60user 0.12system 0:05.78elapsed 99%CPU (0avgtext+0avgdata 57636maxresident)k
0inputs+0outputs (0major+13020minor)pagefaults 0swaps
$ time dash many_vars_bench.sh
40.75user 0.14system 0:41.22elapsed 99%CPU (0avgtext+0avgdata 19972maxresident)k
0inputs+0outputs (0major+4951minor)pagefaults 0swaps
Dash was ~7 times slower. Increase the side of the square "matrix" for an even bigger relative slowdown, since each lookup gets slower as more variables exist (this one already uses 250003 variables).
> One thing we did notice is that subshells can be a bottleneck when the environment is large, and so we avoided subshells as much as possible in the runtime library. Did you observe the same in your testing?
Yes, launching a new process is generally expensive and so is spawning a subshell. If the shell is something like Bash (with a lot of startup/environment setup cost) then you'll feel this more than something like Dash, where the whole point was to make the shell small and snappy for init scripts: https://wiki.ubuntu.com/DashAsBinSh#Why_was_this_change_made...
In my limited testing, Bash generally came out on top for single-process performance, while Dash came out on top for scripts with more use of subshells.
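If you want to isolate the subshell cost itself, a rough sketch along these lines works (the script name is made up and timings are omitted, since they vary by machine):

$ cat subshell_bench.sh
#!/bin/sh
# Each $(...) forks a subshell, so this loop mostly measures fork/exit overhead.
i=0
while [ "${i}" -lt 10000 ]; do
    x=$(echo hi)
    i=$((i+1))
done
$ time bash subshell_bench.sh
$ time dash subshell_bench.sh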
I'm writing something similar, but it's based on its own scripting language. The idea of transpiling C is appealing but seems impractical: how do they plan to compile, say, things using mmap, setjmp, pthreads, ...? It would be better to clearly promise only a restricted subset of C.
One train can move many people at once (especially at peak commute times, when people would otherwise be driving single-occupancy cars just to get to an office). It suffers from similar problems, but it's much more efficient at the same task.