> If you take those genes and put them into organisms like bacteria and yeast, which normally do not have these proteins, they actually become much more desiccation-tolerant
Well I have a newfound fear of humans creating a bacteria as hard to kill as a tardigrade…
I mean all we need now is to find out how to make a prion that forms TDPs so we can glue it to the outside of COVID-19 in some twisted gain of function research.
This is such a sad exemplar of where we've gone as a technological society. Thirty years ago, you could have suggested releasing a virus that destroyed cancer, and people would have cheered.
I think people have just had to become a bit more realistic after having their eyes opened. Right now my imagination says that this virus might destroy cancer, but who knows what else it's going to do?
This reasoning is like the fear of some random TESTED C++ code destroying the internet, because "it might do things that you don't expect in the long run". Well yeah... it might... but...
> imagine a protein that locks onto pieces of DNA that are only present in people of a certain race
The good news is that "race" is a social construct, not a genetic one. Very little genetic variation differentiates between groups of people, and the differences that exist do not map onto socially recognized categories of race.
Won't theoretically stop someone from discovering the genetic signature for a certain natural skin pigmentation, or slanted/smaller eyes. Sure it won't be perfect, but people who would create and use such a weapon probably wouldn't care about collateral damage.
I can certainly see it being China's final solution to the Uyghur question. They have a long history of callously and needlessly killing large swaths of their own people toward nationalistic ends, and not just under Communist rule.
There is more genetic variance within "races" than between them. Your imagination is neither necessary nor sufficient for it to be possible, let alone feasible.
I also honestly do not understand why China is the bogeyman here given that Long Island, NY was the eugenics capital of the world within living memory. Utah, the state in which I live, didn't shutter its own eugenics program until the 1960s. I'm trying to recall when China dropped nuclear ordnance on civilian populations and really failing here.
I figured it's probably a western bubble thing and sure enough most people in most countries don't view China as the obstacle to world peace. https://fullfact.org/news/america-world-peace/
China is the bogeyman here because they are the only major economic power currently running a large-scale state-sponsored genocide, unless you count Russian attempts in Ukraine. They're also one of the few nations with the scientific expertise and resources to potentially create such a virus.
And just because we haven't figured it out doesn't mean it doesn't exist. If you take a baby born to two sub-Saharan African parents and a baby born to two Northern European parents, and keep them in boxes with identical stimuli from birth, they will have drastically different skin tones. If that's not genetic, then what is it?
> the genetic signature for a certain natural skin pigmentation
Such a thing does not exist:
> Our understanding of the genetics of melanin formation and distribution within cutaneous and follicular tissues has recently greatly expanded through the power of genome-wide association studies (GWASs) using large databases such as those of the UK Biobank (22) and 23 and Me (51). This research has provided new insights into the biology of skin and hair color and underscores the highly polygenic nature of these two traits, with complex epistatic interactions apparent between the genes involved.
> People today look remarkably diverse on the outside. But how much of this diversity is genetically encoded? How deep are these differences between human groups? First, compared with many other mammalian species, humans are genetically far less diverse – a counterintuitive finding, given our large population and worldwide distribution...
> Early studies of human diversity showed that most genetic diversity was found between individuals rather than between populations or continents and that variation in human diversity is best described by geographic gradients, or clines. A wide-ranging study published in 2004 found that 87.6% of the total modern human genetic diversity is accounted for by the differences between individuals, and only 9.2% between continents. In general, 5%–15% of genetic variation occurs between large groups living on different continents, with the remaining majority of the variation occurring within such groups (Lewontin 1972; Jorde et al. 2000a; Hinds et al. 2005). These results show that when individuals are sampled from around the globe, the pattern seen is not a matter of discrete clusters – but rather gradients in genetic variation (gradual geographic variations in allele frequencies) that extend over the entire world. Therefore, there is no reason to assume that major genetic discontinuities exist between peoples on different continents or "races." The authors of the 2004 study say that they ‘see no reason to assume that "races" represent any units of relevance for understanding human genetic history. An exception may be genes where different selection regimes have acted in different geographical regions. However, even in those cases, the genetic discontinuities seen are generally not "racial" or continental in nature but depend on historical and cultural factors that are more local in nature’ (Serre and Pääbo 2004: 1683-1684).
It must. If you take a baby born to two sub-Saharan African parents and a baby born to two Northern European parents, and keep them in boxes with identical stimuli from birth, they will have drastically different skin tones. If that's not genetic, then what is it?
I'm not saying it wouldn't be an unbelievably complex signature, but it would be a signature nonetheless.
Naomi: FoxDie is a type of retrovirus that targets and kills only specific people. First, it infects the macrophages in the victim's body. FoxDie contains smart enzymes, created through protein engineering. They're programmed to respond to specific genetic patterns in the cells.
Snake: Those enzymes recognize the target's DNA?
Naomi: Right. They respond by becoming active, and using the macrophages, they begin creating TNF epsilon. It's a type of cytokine, a peptide which causes cells to die. The TNF epsilon is carried along the bloodstream to the heart, where they attach to the TNF receptors in the heart cells.
Snake: And then...they cause a heart attack?
Naomi: The heart cells suffer a shock and undergo an extreme apoptosis. Then... the victim dies.
This is actually super trivial with today's technology.
It's a little too late to "stop building the foundations". That's like wanting to "stop building the foundations" for the tech that allows governments to spy on civilians. We're about 20 to 30 years past that.
P. aeruginosa is an incredible organism. Not only is it antibiotic resistant, but it is also starvation resistant. So even when you have antibiotics that work, it's often too difficult to get them to all the cells, since cells at the bottom of a biofilm just shut down completely for weeks on end. End the course of antibiotics, and they pop back up and start growing again.
Thankfully, there are major tradeoffs associated with those traits, which makes them not particularly virulent to healthy people.
That feels like fatally flawed reasoning. Nature took billions of years to sequester carbon. It’s taken a few hundred to release an amount that’s permanently altered the climate of the Earth.
There’s lots of problems humans can create that “nature” couldn’t precisely because we drastically compress time scales that make adaptation exceedingly difficult.
I asked ChatGPT what it would respond and it produced something very similar to the comment above, down to the same sentences. So this definitely looks like a ChatGPT response.
Since time stretches out to infinity as you approach a singularity, if we assume all matter started out compressed at the Big Bang, then actually it took forever amount of time for the universe to come into existence. I’m not a physicist though so I may have gotten the relativity aspect wrong.
Assuming the start was a singularity is an interesting idea. Unlikely though. Light speed is constant in all frames of ref, even in singularities. So if a bang was accompanied with a flash then the flash must preexist at a constant speed. Which makes no sense since we are saying light came into existence along with everything else at the ‘start’. The ‘start’ must have a fundamentally different nature, where even time came into existence. You see?
Climate has been changing for as long as the planet has existed, and you benefit from everything that a modern society has to offer. Better if you become the example of change in society and go live inside a cave to reduce your own carbon footprint.
Did you even look at what I posted? It addresses what you're saying, and it shows the Medieval Warm Period on the graph. In fact, even the article you posted has a similar graph as the first image on the page.
The Medieval Warm Period and the following "Little Ice Age" are not even visible on the xkcd graph; that's how small they are compared to the massive recent changes. You're looking at a micro trend with a magnifying glass and the macro trend with a distorting lens if you think the medieval events are anywhere close to what's happening. No recorded climate change happened as fast as the current one; there have been wilder changes, but they happened over centuries, not decades.
Saying the climate has always changed so it's OK is like saying you'll die eventually so it's OK if you die tomorrow.
> better understand that climate has always changed and will continue to change.
The Little Ice Age is on the graph and is labeled. The Medieval Warm Period is mentioned but was too regional to show up in global temps.
But the Little Ice Age lasted centuries and was only 0.5C. I hadn't realized it was still happening until global warming erased it in 150 years. Then we added another 1C in less than 50 years. We are on track to warm another 3C by 2100: the amount of warming from a real ice age, in a century instead of a millennium.
Yes, then look 3 cm down and see a massive fucking swing with 20 times the amplitude of both of the mentioned events...
I stand by what I said: without the labels you wouldn't see these two events, while even without a label you can see the current trend is absolutely nowhere close to any other past change.
Come on. You can both dig deeper than a single biased source. The world isn't turning into hell within a century, as certain as the sky won't fall on top of our heads.
If you are old enough, at least learn from the past two decades. Every single time our goodwill was placed in believing the news drama, the end result was overwhelming taxes on the West while forgetting about the major culprits of pollution in India and China.
Where are the XKCD man-child cartoons teaching about that?
Our society today doesn't behave that differently from medieval scholasticism. In that sense humans are just like the climate, changing only ever so slightly.
(This is coming from someone with ADHD but not dyslexia)
It's harsh, but it isn't wrong, right? The reason that these states of being are considered 'disorders' is because they interfere with daily life and with tasks that other people consider to be easy.
I need to take a lot of time to edit my writing and to re-evaluate the content that I write because it is easy for me to slip into a multi-paragraph tangent. Fundamentally that happens because my thinking is not sharp/focused.
On the other hand, when I am done writing I get a feeling of great accomplishment. Like I was able to swim through the deep waters of my mind and harpoon all the good ideas. Being smart isn't worth much if you can't communicate your ideas!
Speaking as someone with mild dyslexia and high ADHD, I would say that my analytical skills, ability to focus, and spatial awareness (I could go on here, but I think you get the gist) are well above average based on my work/university experience.
I do take offense at people implying my "mind is weak" based solely on my writing skills. Yes, I know writing skills are important for communicating, but humans have many more skills to offer than just writing, or even communicating for that matter. Yet people seem to be condensing their opinion of other people into a very high-level simplistic rule. Are we going back in time here? What's next? Judging whether people's minds are weak based on DNA/race?
>I need to take a lot of time to edit my writing and to re-evaluate the content that I write because it is easy for me to slip into a multi-paragraph tangent. Fundamentally that happens because my thinking is not sharp/focused.
Same here on the first part, but not because I can't focus; it's because I have to juggle many trains of thought while slowly merging them into one stream of communication, to the point that I give up on many things that I really want to say.
You are seriously underestimating the importance and potential of communication. Excellent communication conveys expectations, specificity, and manipulation in ways you are not capable of imagining without being so skilled. It's the difference between effective leadership and being a tool.
If you want to talk about personality, the member of the Big Five most correlated with intelligence is openness. The member with the highest negative correlation to intelligence is conscientiousness. To be skilled in any kind of leadership you need both high intelligence and super high conscientiousness. To be skilled in writing you need high openness and at least moderately high conscientiousness.
Irrespective of creativity and intelligence you need some amount of conscientiousness to be goal oriented. That is just yourself. To direct others to do the same you need that plus excellent communication skills. There are no shortcuts or alternatives here.
TIL about Normal Mapping, which to my layman eyes kind of looks like computing a hologram on the flat surface of an object. In the coin example in TFA, even though I now 'know' that the coin is a cylinder, the normal map gives it a very convincing coin shape. Cool!
If you think that's cool, the next level, which really could be considered to behave like a hologram, is "parallax mapping" and its variant "parallax occlusion mapping".
Basically, in terms of levels of realism via maps, the progression goes
1. Bump mapping: the shader reads a heightfield and estimates the gradients to compute an adjustment to the normals. Provides some bumpiness, but tends to look a little flat.
2. Normal mapping: basically a variant of bump mapping -- the shader reads the adjustment to the normals directly from a two- or three-channel texture.
3. Parallax mapping: the shader offsets the lookups in the texture map by a combination of the heightmap height and the view direction. Small bumps will appear to shift correctly as the camera moves around, but the polygon edges and silhouettes usually give the illusion away.
4. Parallax occlusion mapping: like parallax mapping, but done in a loop where the shader steps across the heightfield looking for where a ray going under the surface would intersect that heightfield. Handles much deeper bumps, but polygon edges and silhouettes still tend to be a giveaway.
5. Displacement mapping: the heightfield map (or vector displacement map) gets turned into actual geometry that gets rendered somewhere further on in the pipeline. Pretty much perfect, but very expensive. Ubiquitous in film (feature animation and VFX) rendering.
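To make step 2 concrete, here's a minimal CPU-side sketch in Python (not real shader code; the texel values and function names are just for illustration) of what a pixel shader does with a normal map: decode a tangent-space normal from an RGB texel and use it for Lambertian diffuse lighting.

```python
import math

def decode_normal(rgb):
    # Normal maps store unit vectors in RGB: each 0..255 channel maps
    # to the -1..1 range. A flat surface encodes to (128, 128, 255),
    # which is why normal maps look mostly light blue.
    x, y, z = ((c / 255.0) * 2.0 - 1.0 for c in rgb)
    length = math.sqrt(x * x + y * y + z * z)
    return (x / length, y / length, z / length)

def lambert(normal, light_dir):
    # Diffuse intensity is max(0, N . L) for unit vectors.
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

# The "flat" texel decodes to roughly (0, 0, 1), so a light shining
# straight down onto it gives nearly full intensity:
n = decode_normal((128, 128, 255))
print(lambert(n, (0.0, 0.0, 1.0)))  # close to 1.0
```

The real shader does the same math per pixel on the GPU, usually with the light direction transformed into the same tangent space as the stored normals.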
So "classic" rendering, as per DirectX 9 (and earlier), is Vertex Shader -> Hardware stuff -> Pixel Shader -> more Hardware stuff -> Screen. (Of course we're on DirectX 12 Ultimate these days, but stuff from 15 years ago is easier to understand, so let's stick with DX9 for this post.)
The "hardware stuff" is automatic and hard-coded. Modern pipelines added more steps/complications (geometry shaders, tessellators, etc.), but the Vertex Shader / Pixel Shader steps have remained key to modern graphics since the early '00s.
-------------
"Vertex Shader" is a program run on every vertex at the start of the pipeline. This is commonly used to implement wind-effects (ex: moving your vertex left-and-right randomly to simulate wind), among other kinds of effects. You literally move the vertex from its original position to a new one, in a fully customizable way.
"Pixel Shader" is a program run on every pixel after the vertices are figured out (and redundant ones removed). It's one of the last steps, as the GPU is calculating the final color of that particular pixel. Normal mapping is just one example of the many techniques implemented in the Pixel Shading step.
-------------
So "Pixel Shaders" are the kinds of programs that "compute a hologram on a flat surface". And it's the job of a video game programmer to write pixel shaders to create the many effects you see in video games.
Similarly, Vertex Shaders are the many kinds of programs (wind and other effects) that move vertices around at the start of the pipeline.
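As a rough mental model (a toy Python sketch, nothing like how a GPU actually executes; all names and numbers are made up), the two programmable stages slot into the pipeline like this:

```python
import math

def vertex_shader(pos, time):
    # Toy wind effect, as in the example above: nudge each vertex
    # sideways with a sine wave before rasterization.
    x, y, z = pos
    return (x + 0.1 * math.sin(time + y), y, z)

def pixel_shader(uv):
    # Toy per-pixel stage: decide the final color of one pixel.
    # Real pixel shaders sample textures (normal maps, etc.) here.
    u, v = uv
    return (u, v, 0.5)  # simple UV-gradient color

# The fixed-function "hardware stuff" between the stages
# (rasterization) is faked here as a precomputed list of pixel UVs.
verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
transformed = [vertex_shader(v, time=0.0) for v in verts]
pixels = [pixel_shader((u, v)) for u in (0.0, 0.5) for v in (0.0, 0.5)]
```

The key point the sketch mirrors: the vertex program runs once per vertex at the front of the pipeline, the pixel program runs once per pixel near the end, and everything in between is fixed hardware you don't control (in the DX9 model).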
I very infrequently watch the news. It is obvious to me that major news outlets, at least in the US where I am familiar, are definitely attempting to push political agendas, shape public opinion, and manufacture consent.
However, Fox News is consistently the only news source I see that is explicitly trying to fearmonger, to use identity politics to pit people against each other, and is the only news source that gives airtime to people who advocate for the reduction of freedoms and rights of certain classes of people.
I can count on the rest of the news sources to be vaguely pro-police, pro-capital, pro-land-owner, pro-status-quo. There are negatives to having news be so favorable to the ruling elite. But on the topic of who gets to live with decency in the US, at least the other major news outlets agree that the answer is 'anyone with money'. The American dream. On Fox News, though, there are 1000 ways to be the wrong kind of person who needs to be shamed and harmed and removed from society.
500%. Keeping people glued to the tube with sensationalism is uncomfortably common in many places.
But you're dead-on right: there is nothing remotely close in equivalency to the people-vs-people hating & lies of one side of the media. There's coverage of issues & actions on one hand, & coverage of how fallen & bad everyone on the other side is from the not-so-honorable opposition.
Tucker Carlson's rebirth as a dark soul, after being shamed by Jon Stewart, is one of those dark bitter tales of humanity where someone, faced with crisis, tripled down on every bad aspect. There's a certain White House Correspondents' Dinner that begat another shameless fall towards power, also of note.
IMO the "main source of inequality" is that tech allows a small number of people to use technological and fiscal leverage to make an outsized impact on society as a whole. Anyone who has a job that produces value in a 1:1 way is positioned to be 'disrupted'. NLP, etc, just provides more tools for companies to increase their leverage in the market. My bet is that GPT-4 is probably better at being a paralegal than at least some small number of paralegals. GPT-5 will be better at that job than a larger percentage.
Anyone who only has the skills to affect the lives and/or environments of the people in their immediate surroundings is going to find themselves on the 'have nots' end of the spectrum in the coming decades.
This is exactly what has happened to commercial and investment banking (market/trading) in the last 30 years. Computers and mass automation. Even if your profits only grow with inflation (in reality, they grow much faster), if you can reduce costs each year (less labour required), then return on equity continues to rise. It is crazy to me that most big commercial banks still have so many physical branches. I guess they exist for regulatory purposes -- probably _very_ hard to close a branch without creating "banking deserts".
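A toy illustration of that dynamic (all numbers made up): even if revenue only tracks ~2% inflation, cutting costs a few percent a year through automation keeps return on equity climbing without any real growth.

```python
# Hypothetical bank: revenue grows with inflation, costs shrink yearly.
revenue, costs, equity = 100.0, 70.0, 200.0
roes = []
for year in range(5):
    roes.append((revenue - costs) / equity)  # return on equity
    revenue *= 1.02   # profits only keep pace with inflation
    costs *= 0.97     # automation keeps cutting labour costs
print([f"{r:.1%}" for r in roes])
```

ROE rises every year in this sketch purely from the cost side, which is the point: automation compounds even when the top line is flat in real terms.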
This has changed considerably. Chase remodeled most of their existing branches so they have like, 1 teller, and 2 or 3 people sitting at desks for other transactions. That's it. The days when your usual branch had a line of like 15 tellers are long gone. Out here in Southern California, I think they also closed many of their branches in the past few years.
But looking back further - ohhhhhh yeah dude. Oh yeah. Totally. For a brief period my mom worked at a BofA facility that _processed paper checks_. Like they had a whole big office for it. That's completely 100% gone now. The checks get scanned at the point of entry (cash registers, teller counters, etc) and then shredded.
There's also a charitable way to read that. You won't change the world by resigning yourself to its current state; you have to reject its current state. Which isn't to say this is the next big thing, just that big ideas start by rejecting reality.
That's a pretty fair summarization of the past, but this is unlike any Apple product ever. IMO, there's nothing Apple can do that will make a significant number of people want to buy and strap on a $3k headset.
Right, and this is why they fundamentally failed. If their main depositors were traditional businesses and individual accounts they wouldn't be an issue today. They invested their money as if they had low-risk depositors, but a start-up is a high-risk depositor: they don't have a proven business model, so it is hard to forecast their behavior.
To me this is classic SV echo chamber thinking. There’s a reason traditional banks are more wary of lending to start-ups and it isn’t because they hate money.
I hate to jump on the bandwagon here, but imagine if Instagram released an update to their app to make it look like it did in iOS7. People would uninstall the app. They would be laughed at.
It isn't that current design trends are 'better'; they are exactly that, trends. If you don't keep up your app will look dated. To my inexperienced eye, the iOS 7 design looks like the buttons take up way too much space; they are too visually interesting and distract from the content. But that isn't an objective take, it is just that I'm used to the current trend.
I could write about how Members Only jackets and moon boots were the height of usability and design, but they were just a passing fashion. Nothing is really gained or lost, just changed.
Perfect! Also, imagine a UI designer suggesting something like iOS 7 to their client; it's simply not feasible.
I also think this is more about what the current trend is, and this isn't exclusive to UI design. Clothes from the 1800s-1900s also look cool, but no one is wearing them because they would look weird to everyone else.