Applets didn't fail because of a missing JRE; the JRE was included in IE. It was only after Microsoft started playing around with the implementation that Sun asked them to stop, which they did. Only then did a missing JRE become an issue.
> I do wonder though why Oracle doesn't seem interested
And then what? Everybody would write Java code and run their ... not quite "apps" but we can call them app-lets -- in the browser, so that "write once run everywhere" could be achieved?
Nooo! I spent a summer in college writing physics education applets. The scars are still there.
Seriously, I do wonder what the real value of some of this WASM stuff is. I mean, seems cool to run a java (or rust or <insert language here>) app in the browser, but what is the real world use case? If I run an app in the browser, I still have to do all the server side business validation because "you should never trust the client".
What am I missing?
Edit: on reading other comments, apparently this is direct-to-js compilation, not WASM. The intent of the comment still stands.
For my use case of building Filestash, WASM is a game changer in these two areas:
1. Using libraries from ecosystems outside JS. In the past few months, I've added support for file types like psd, dbf, arrow, parquet and about 50 more. To give a concrete example: https://www.filestash.app/tools/parquet-viewer.html WASM opens up a very exciting door (there's a rough sketch of the browser side below this list). Along the same lines, I've stumbled upon a couple of Java-only libraries which I would love to ship in the browser without having to create a web service to interact with them.
2. Enabling third parties to make plugins that can run in my app in a safe way. In my case, plugins are zip files containing a bunch of assets, and the WIP piece is being able to put WASM in there that will run server side without giving those plugins a blank check to act crazy.
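To make point 1 concrete, here is a minimal sketch of what loading a WASM build of a library in the browser can look like. The module name "parquet_reader.wasm" and its exports are made up for illustration; a real library compiled from Rust/Go/Java ships its own generated bindings (wasm-bindgen, TeaVM, etc.), but underneath it's the standard WebAssembly JS API:

    // Minimal sketch (TypeScript): load a hypothetical "parquet_reader.wasm" module.
    // A real library's toolchain generates richer glue code; this shows only the raw API.
    async function loadParquetReader(url: string): Promise<WebAssembly.Exports> {
      const imports: WebAssembly.Imports = {}; // host functions the module expects, if any
      const { instance } = await WebAssembly.instantiateStreaming(fetch(url), imports);
      return instance.exports;
    }

    // Usage (in a module with top-level await): inspect what the module exposes,
    // then feed it bytes from an <input type="file"> or a fetched file.
    const exports = await loadParquetReader("/wasm/parquet_reader.wasm");
    console.log(Object.keys(exports));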
Every time I read about a purported "housing shortage" I'm reminded that there are about 140 million housing units in the US[0], with an average of 5.5 rooms per unit[1], or about 700 million rooms, all for a population of 350 million, or about 2 rooms per person.
This doesn't look like "we have a housing shortage". What we do have is a shortage of affordable housing in the megacities, and it's a rampant problem in every one of them.
If I'm reading myself right, I'm suggesting that there's no need for "a solution to the housing shortage," since -- with more than 2 rooms per person on average -- it's not a problem to begin with. The problem frequently called "the housing shortage" is really a problem of "housing affordability in the megacities," and we should call it by its real name.
How are those rooms distributed? It's not like they are individually moving parts.
People buy a house big enough to hold their kids, then they age and the kids move out, and there are lots of fully owned homes with empty rooms, but no places for the now-adult children to live, until a prior generation dies.
> "the housing shortage" is a problem of "housing affordability in the megacities," and we should call it by its real name.
Housing affordability problems are driven by a single thing: a shortage of housing. Refusing to call the shortage a shortage, and instead referring only to the symptom (unaffordability) rather than the cause (the shortage), is willful deception meant to prevent action on the cause.
This is not a problem just in megacities; it's spreading everywhere else in the country as the problem gets worse and worse. It showed up first in the most in-demand cities, but as the rise of remote work let people spread out more, it affected more and more locations. Meanwhile, people living in the highly economically productive areas with the greatest housing shortages say there's no need to allow more housing to be built because remote work solves the problem. They speak out of both sides of their mouth, though: a few short years ago they denied that the shortage caused the affordability problem, but when there's something that can be used to lessen the shortage (remote work, banning Airbnb), they grab on eagerly to the shortage explanation for housing affordability.
The story of the housing shortage in the US is people desperately, by any means they possibly can, avoiding addressing the shortage and refusing to be realistic about it.
> willful deception to prevent action on the cause.
I don't care either way. I don't live in the US. Action or non-action, I'm unaffected by it. There's no reason for me to "willfully deceive" anyone, as I don't stand to either gain or lose with any outcome. There's also no reason for you to frame this as a personal attack.
I've checked Zillow though.
There are plenty of $1 homes, mostly dilapidated and non-functional, though the land alone could be worth the $1 if one can afford demolition and rebuilding. But in the $10,000 to $15,000 range there are a lot of pretty normal-looking homes. Even if one doesn't have that amount as a down payment, I assume plenty of banks would be willing to give a mortgage for that sum with 25 years of $150/mo payments.
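For what it's worth, a quick sanity check with the standard amortization formula (the 7% rate here is just my assumption, not any bank's actual offer) suggests $150/mo more than covers a $15,000 principal over 25 years:

    // Fixed-rate mortgage payment: P * r / (1 - (1 + r)^-n),
    // where r is the monthly rate and n the number of monthly payments.
    // The 7% annual rate is an assumption for illustration only.
    function monthlyPayment(principal: number, annualRate: number, years: number): number {
      const r = annualRate / 12;
      const n = years * 12;
      return (principal * r) / (1 - Math.pow(1 + r, -n));
    }

    console.log(monthlyPayment(15_000, 0.07, 25).toFixed(2)); // ~106, so $150/mo covers it comfortably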
The problem is nobody wants to live where these houses are, because it's not SF, while in SF there are a lot of options under $2M, but not many people have that amount of money.
"Housing shortage" doesn't exist. The only shortage that exists is the shortage of $10,000 homes in SF.
The benefits of a pension start before retirement. I can avoid setting aside a huge pile of money against the possibility of an extremely long lifespan, and avoid worrying as minor changes to that pile become really critical near and in retirement.
Suppose you’re 50 with 2 million USD in savings. How soon do you retire and what kind of lifestyle do you live in retirement? That’s heavily dependent on what kind of pension is waiting.
Where I am, we have personal pension plans. You can check "what kind of pension is waiting" by going to the pension fund's website. Gov't doesn't play any role in any of that -- except (a) prescribing mandatory monthly payments and (b) determining the retirement age. (You only get a small gov't pension if you haven't collected enough in your personal plan.)
If it’s just a savings account, that’s not a pension.
Assuming it’s a defined benefit pension it can include guarantees like inflation adjustments. That’s the fundamental advantage which largely offsets the disadvantage of lower average returns.
I’m not saying putting all your money into a defined benefit plan is ideal, but due to the diminishing marginal utility of money a guaranteed minimum lifestyle is extraordinarily valuable.
It's not a savings account. My employer and I both pay monthly installments to the pension fund, which are tax free. The money is under my name and is not used to cover anyone else's pension. I can decide where it is invested, and change it at will. Once I go into retirement, the accumulated amount is converted to monthly payments which I will receive until I die. If I die before retirement, my children will inherit the whole sum.
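Roughly, that last step works like pricing an annuity: the fund divides the accumulated sum by a conversion factor built from remaining life expectancy and an assumed return. A toy sketch, where the 4% rate and 20-year horizon are my assumptions (real funds use mortality tables and regulated conversion rates):

    // Toy annuity conversion: accumulated lump sum -> level monthly payment over an
    // assumed remaining lifetime, discounted at an assumed rate. Real pension funds
    // use mortality tables and regulated conversion factors, not a fixed horizon.
    function monthlyPension(lumpSum: number, annualRate: number, expectedYears: number): number {
      const r = annualRate / 12;
      const n = expectedYears * 12;
      const annuityFactor = (1 - Math.pow(1 + r, -n)) / r; // present value of 1/month for n months
      return lumpSum / annuityFactor;
    }

    console.log(monthlyPension(500_000, 0.04, 20).toFixed(0)); // ~3030/month under these assumptions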
In the way that (a) its establishment and monthly installments are mandated by law, at rates prescribed by the law (which isn't the case for "a savings account"); (b) both the installments and the returns are tax-free by law (which isn't the case for "a savings account"); and (c) the law that establishes the fund, mandates payments by both employees and employers, determines the rates, and frees the whole thing from taxes is called "the Pension Law" (which isn't the case for "a savings account").
However if all of the above also applies to savings accounts in the place you reside in, you can call it "a savings account" all right.
I think this is very similar to 401(k) or IRA, or "self-invested personal pension", except that the law sets the percentage amounts for both employee and employer contributions.
It could be, on the contrary, that the legislators came up with the "straw bale" precisely because it's something that does not belong under a bridge: it raises the eyebrows of the people navigating the river and makes them wonder what's going on, which is exactly what draws their attention. If so, it serves its purpose all the better as straw bales become less common.
That's the British system working as designed. If there's a law, no matter how ancient, the British should comply. If a law needs to be changed, that's the Parliament's job.
Even the British courts, in sharp contrast to many other places, "deliver the law as it is, and not as we wish it to be" -- see for example [0] or [1].
"When the headroom of an arch or span of a bridge is reduced from its usual
limits but that arch or span is not closed to navigation, the person in control
of the bridge must suspend from the centre of that arch or span by day a
bundle of straw large enough to be conspicuous and by night a white light."
Does that mean the law is not being complied with, in this case, since the bales are hanging from adjacent bridges, not the "centre of that arch or span" itself?
Delays due to trucks striking bridges are a worldwide problem, at least in countries with railroads, despite yellow-and-black striped reflective panels, height warning signs, and sometimes height detectors that trigger flashing red lights.
Perhaps we should try a bale of straw next.
The London Blackwall tunnel has a more modern take on checking height: https://maps.app.goo.gl/b5P5Td1hsuSjLU3w8 traffic signals, barriers like at a railroad crossing, giant panels across the road at height, and a police car on standby to pull out and fine anyone that doesn't read the signs - I presume this happens often enough that they can justify the cost.
But then the bale of straw applies to ships, not vehicles, and to bridges, not tunnels.
Your link shows the Dartford Crossing, an M25 bridge miles downstream of the City. The Blackwall Tunnel runs under the Thames at Greenwich and afaik just has the old school hanging metal blocks at height https://maps.app.goo.gl/N5xSF148ggLVTDtS8
It doesn't surprise me too much that police are on standby; a closure of either the tunnel or the bridge has a major effect on traffic all over London.
There are additional traffic lights further into the Blackwall tunnel, and a slip road out that can be used for overheight vehicles. I do remember having a 10-15 minute wait once while they sorted things out when a lorry driver got caught.
I have a feeling there are automated signs before the tunnel (or at least there used to be), but I've not been through it for a year or so and things will have changed with the Silvertown tunnel opening.
I have seen someone not paying attention at the Rotherhithe tunnel, and the roof of their van was a mess (and they were probably going to pick up a fine due to the restrictions; the 2-tonne gross weight limit is lower than a lot of van drivers expect).
I presume the Blackwall one is that unlit LED sign just at the start of the off-ramp. Then there's another set of height detectors on the same post to catch out anyone who's still not paying attention.
I question who approved that the main lanes ahead of your link have 2.8m/9ft limits but the police warning says vehicles over 4m/13ft will be stopped. Can I take my 10ft truck through or not?
I'm starting to feel a tiny bit of sympathy for drivers that get confused by this.
Blackwall seems also to have two sets of lights and barriers, and an off-ramp in between. That's probably also for fire safety, to close and evacuate the tunnel and get the emergency services in, but I imagine it's used for height enforcement too if a loud CLUNK on your truck cabin isn't enough.
As an aside, the person who signed the original heights as (13ft)(4m)(9ft)(2.8m) needs to learn a bit about UI design. Yes, two lanes, but the gap between the central two signs is far smaller than to the other sign for the same lane. Also 4m is just over 13 ft 1 inch, which there'd be space to include as there's already a 0 on the leftmost sign (and from the rightmost we see that decimals are allowed on signs). Guess we're going to rely on the CLUNK after all.
In Germany even this wasn't enough; at a couple of bridges they had to narrow the road leading to the bridge so that only small cars could still reach the bridge under repair.
I also imagine it wasn't cheap doing this, but apparently as long as people can get away with something, there are always those who will try, regardless of how it impacts others.
It's an ancient practice, codified into law in 2012 when the regulatory framework was re-codified from multiple laws like the Port of London Act 1908 as well as time-immemorial acts like this one.
> Even the British courts, in sharp contrast to many other places, "deliver the law as it is, and not as we wish it to be"
The English practically invented the idea of common law. Even today there are still important legal principles based entirely on the decisions of earlier courts.
The constitution of the United Kingdom comprises the written and unwritten arrangements that establish the United Kingdom of Great Britain and Northern Ireland as a political body. Unlike in most countries, no official attempt has been made to codify such arrangements into a single document, thus it is known as an uncodified constitution. This enables the constitution to be easily changed as no provisions are formally entrenched.
In the US we only have a remnant of that in the Senate, in what has been popularly marketed as "the Nuclear Option." A Senator just makes a point of order that a Senate rule is the opposite of what it actually, verifiably is. The chair denies it, the Senator appeals the decision, and a majority of the Senate then overrules the chair.
After this has happened, the rule just changes, and whatever was not in order in the past is in order in the future (or vice versa). In the Senate as in Parliament, the majority is the law; it can't break the law.
There are still important legal principles in the US and other places around the world based entirely on the decisions of earlier English courts. The first local decisions will reference English cases, and English legal experts often would have been consulted.
Same thing with most of the world's parliaments and congresses having to reference English Parliamentary precedent in order to figure out how to operate themselves. The UK Parliament and courts may be terrible, but they invented the thing and we're forks.
Yes, let's mention Roman Law in relation to British Common Law. The latter derived from the former, but there's a fair distance of about 1,000 years between our three points in time.
For all intents and purposes, every precedent and matter of jurisprudence can be resolved by referring only to Common Law. It would be rather exhausting and absurd to try and reach back past 1066 AD because things have changed, a lot.
Now in terms of forking Roman Law, there are other legal systems which are not directly related or derived from British Common Law. Especially the Napoleonic Code, which influenced Italy, which in turn influenced Catholic Canon Law. So here we have another lineage and a deeper "fork" from Roman Law where British Common Law doesn't really figure.
Also, someone commented with a non sequitur about "antidisestablishmentarianism". I'd just like to point out that that word refers to revoking things like the 1st Amendment and supporting Established Church laws, because it's an "anti-dis" double negative.
If you want to talk about the United States' 1st Amendment, "disestablishmentarianism" is the term used to describe how the Founding Fathers set up the States without those meddling bishops.
Usually the judges do not "ignore or modify" the law, but rather "interpret" it in a creative manner. You might use, as an example, the question of "does the US Constitution guarantee women a right to abortion." Some judges decided that it does; later, other judges decided that it does not. Considering the opposing outcomes to the same question, it's clear that some of them were wrong.
One could argue that 'a corporation has personhood' is a technical contrivance that tries to manipulate the letter of the law into achieving a particular outcome. Going with the spirit of the law instead, that argument would never hold water.
It could be a disaster for the courts to interpret them too literally (Is literally any weapon OK under the 2nd? Does free speech include a mob boss ordering a hit?), and constitutions are really hard to amend, so heavy interpretation is a necessary evil.
That is an interesting example, because the Second Amendment is, I think, a primary example of a law that is very creatively read by folks who consider themselves literalists.
If the 2nd Amendment was literally interpreted, it would be (quoting from memory) “in order to form a well-ordered militia the right to bear arms shall not be infringed”.
As in: you cannot infringe the right to bear arms in a well-ordered militia, but gun ownership might be regulated, for example by the militia organization owning the arms. Nothing would prevent codifying in law what constitutes a well-ordered militia, etc.
>A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed
It's the only amendment that comes with a justification, so it's unusual, but there's nothing in the text that limits the right to the stated justification.
> If the 2nd Amendment was literally interpreted, it would be (quoting from memory) “in order to form a well-ordered militia the right to bear arms shall not be infringed”
I don't agree at all that this is a case of creative reading. The actual text of the amendment is "A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed."
Note that the text does not say "in order to" or anything like that, which is why interpretation of this amendment gets controversial. Was the intent that bearing arms is only a right insofar as people are part of a local militia? Was the intent that people must have the right to bear arms, and the militia was simply cited as one example of why? It is genuinely unclear from the text, which means that no matter what we do, we have to layer our own interpretation on top. That doesn't mean anyone is reading the law creatively; those are just the unfortunate facts of having to deal with an unclear text.
Or it means that if the government needs to call in levies, it would be good if the volunteers could show up with appropriate weapons. Ironically an automatic rifle (think an AK) could be what they're talking about, while pistols (being arguably useless side-arms in a battlefield) might be far less in the spirit of the law.
Historically, not owning a sword or longbow could get you in legal trouble in some cities and time periods, since it meant you weren't capable of helping defend the city. I'd say that in the spirit of the law, it should mostly allow the ownership of useful infantry weapons, or dual purpose ones (hunting rifles?), rather than self defence pistols.
But the US interprets it differently because the constitution is a bit vague, the constitution is hard to change, and practicalities and politics exist.
That's... how the 2nd Amendment used to be treated, actually: state laws against concealed carry have a lo-o-ong history, and they were held to be perfectly constitutional until recently. Oh, and "well-regulated" used to mean "well trained and supplied" back in those days.
And the 2nd actually reads (if you fix its grammar, since it's ungrammatical by the standards of modern English) "since the well-regulated militia is necessary to the security of a free state, the right of the people to keep and bear arms shall not be infringed". Now notice that it's a conditional rule, and its premise ("since...") is no longer true: militias are not necessary for the security of a country, so the conclusion should lose its force. And arguably that's what the Founders intended: if they had meant it as an absolute rule, they would have omitted the first part and simply stated that "the right of the people to keep and bear arms shall not be infringed", period.
The Supreme Court in the United States has been playing a looooot of "Calvinball" recently. They've never been completely immune from it, but it has gotten a lot more nakedly political.
Gone are the days when everyone was spammed with Monty Python references. The Gen-Zs in my office haven't even heard of, let alone viewed, the Holy Grail so half the references our boss lays out are lost on them. At least it's not dead yet.
On the other hand, I had to ask them what a Kirby was. I'm still not sure but I know it's pink.
That's sad. It's not like Holy Grail was in theaters when I was watching it in the late 80s, and the "effects" were never good enough in the first place to become dated. We watched it and lots of other stuff on VHS because it was good, regardless of when it was made.
I suppose some of the jokes depend on cultural things that might not be taught as well anymore, like the Trojan Horse. But most of it is about human nature, so it seems like that should hold up.
Interesting; at least 10 years ago, everyone in my school knew Monty Python. Maybe that's because it was on YouTube at the time. Not really the case anymore; some of it is still there but a lot has been removed - you're not going to find 'Holy Grail part 1/11' these days.
I've noticed similar. I quote lots of movies, usually one liners as appropriate. Between age and less uniform media exposure, my references more often than not fall flat. And I feel less connected.
If you talk to anyone under 30, there's a vague sense of 'the past' with a few landmark events - mostly Star Wars, Pokemon, Miyazaki. Beyond that it's all recent comics, superhero movies, video games, and anime, with a big subculture stanning book trends like romantasy.
Most of what happened before 2000 doesn't seem to exist in cultural memory.
It's not quite true that nothing that happened before 1950 exists at all. But you're not going to find many people who are interested in the art, music, literature, design, or architecture of earlier decades - never mind centuries.
It's as a big a break as there was in the 60s. For that generation the 50s were still an influence, but anything earlier pretty much just disappeared.
I guess the sense of a rubicon at the end of the 40s was due to WW2, but why at the year 2000? Because phones? Or big round number effect, perhaps? The year 2000 was built up in our minds as when the future was expected to begin. (Every new gadget produced around 1990 was the Something2000. CarVacuum2000, Ionizer2000, SuperShoehorn2000, etc.)
The "less uniform media exposure" phrase invokes the (paranoid?) fear that we might lose common cultural reference points. In short, today's kids watch whatever. Though I'm sure we'd just find a new social script to work around the inability to quote Python.
That's common in both European courts (look at e.g. the history of homosexual marriages in the EU) and in the US ("Citizens United").
The core issue is that no Constitution, in fact no law or decree at all can account for all possibilities that real life offers, and so all the bodies of law are up for interpretation all the time.
This is also the case in the UK. Where things are not crystal clear they are interpreted by judges and can become precedent (see the recent “definition of a woman” interpretation).
The issue highlighted by, say, the Owens vs Owens example, is that the law as it stood was clear and not open to interpretation, though obviously unfair. The law needed to be changed, which required parliament.
Religion is what it replaced. Where one person, with a clique of courtiers who personally relied on him for power, enacted whatever took their fancy. Their word was power, whether it was starting wars or forging alliances with unsavoury countries - and woe betide you if you challenged it.
I think you're conflating religious beliefs with ethics. You can't have a religion that is flexible on beliefs, otherwise it is not a religion, but the actual core religious beliefs are fairly limited. In Christianity, Jesus dying to reconcile the world to God is the whole point; without that it is something else. The whole point of Buddhism is that all emotions are pain, and that realizing that everything is really nothing (since all composite things are impermanent and everything is composite) is the path to nirvana. All the other beliefs and ethics come out of this.
But even "submarine" religions (ones that people do not think of as a religion) follow the pattern. Communists worship the State (or perhaps the Party), because the problem with society is the structure of society, so only the State can bring the salvation of equity. American Progressives worship sexual identity. Progressives are flexible--except if you don't accept a particular identity, think that gender is not malleable, refuse to use pronouns, etc.
However, I think even "most religions" are not very flexible. 50% of the world's population is either Christian or Muslim, and both religions are pretty prescriptive in their ethics.
No... I'm afraid you're dividing up a religion in a way that anthropology does not.
You are close to something. You've found the division between worldview and religion. A worldview is bigger, and is individualistic - but generally founded upon tenets shared by others: "a framework of ideas and beliefs forming a global description through which an individual, group or culture watches and interprets the world and interacts with it as a social reality."
I will say: There is no accepted definition of a religion. There are hotly debated definitions, but no concrete and agreed formation of what it constitutes.
However, generally speaking, a religion is a set of socio-cultural systems, typically tied to a set of beliefs that tend to have supernatural or spiritual elements. However, the systems are essential; the beliefs are not. [0] Many agnostics and atheists follow religious practices, and form their own religions. There are Christians who do not believe in Christ.
Because the core of a religion is social and cultural, it varies greatly across time and place. The Christianity of early Rome would be unrecognisable to most Christians today. The religion has changed almost every single practice over time, because of the cultures that have influenced it. [1]
[0] An example would be "Jewish Atheism". It is a religion, with practices and rites, but it does not carry with it supernatural or spiritual beliefs. Another would be "Mainline Protestant Buddhism", also known as Secular Buddhism.
[1] An example of one of the most important rites in early Christendom that is no longer regularly practiced in Rome would be the washing of feet. The host welcomed their guests on their knees, caring for them. Society moved on, shoes and roads changed, it was no longer necessary, and the religion changed around it.
You have one too many negative prefixes there. The Church of England is already established. Those who want to remove that status are proposing disestablishment. Antidisestablishmentarianism is the desire to maintain the status quo.
As someone else pointed out, you're over-negating establishmentarianism.
But it doesn't matter anyway - the UK is a pretty atheist society and getting more so... Those who self-describe as "atheist" in the census:
2001: 15.9%
2011: 25.7%
2021: 37.8%
In Scotland, figures are higher with a majority of the population now describing themselves as atheist (51.1%)
And these figures are only for people who describe themselves as atheists, not just agnostic. The number of people saying they believe "in god" was only 16%, and expanding that to "any god" bumped it to 27%, according to a YouGov poll in 2020.
Also, the most irreligious government ever is the most-recent one, with 40% of MPs opting to make a secular affirmation of service rather than swear a religious oath.
Basically the god-squad is done in the UK; it's just a matter of time - which is odd for a place with an official state religion, as opposed to somewhere like the USA, which is officially non-affiliated with any religion but has "Christians" who wouldn't recognise Jesus unless he was white, toting an Uzi, and telling them to give him money now to get a great afterlife - "prosperity gospel" my arse.
The problem with this idea is that it's unattainable.
Let's start with something simpler: a living room. There's no universal design that would fit just any living room. The layout and the set of furniture that would work for my living room will not work for yours. Size, shape, windows, doors, connection to other spaces - everything matters. If you want a great design for your living room, you literally need to start from your specific living room.
There's a great idea -- why don't we come up with a resizable (reflowable?) design that could fit any living room in the world? While this idea might be entertaining for an engineer's mind, it doesn't work in practice, unless you can settle for just a mediocre design.
Also, being intellectually honest, we need to attack the strongest Apple we can imagine, not a weak Apple that's easy for us to attack. And that strongest Apple will never adopt this idea because they aim to design the best computer/phone/tablet that they can, and in order to design that they need to start with the computer/phone/tablet.
The idea of a phone connecting to a display/keyboard/mouse and becoming a computer has the problem that you could either optimize your design for what you have with a phone, or for what you have with a display and peripherals. It will never be as good as the system designed from the ground up to be just a single thing. It's always nice to have options, but there won't be any mass adoption for the mediocre combo. It was dead in the water with Palm Foleo in 2007, it's just as dead in the water 18 years later.
> could either optimize your design for what you have with a phone, or for what you have with a display and peripherals. It will never be as good as the system designed from the ground up to be just a single thing.
This conclusion looks unsubstantiated to me when you speak of modern devices that have sufficient performance for most typical tasks. You probably can't design a gaming computer+phone, but nothing prevents you from making a GNU/Linux phone able to work with different desktop environments depending on its current mode. Indeed, the PureOS and Mobian operating systems already offer that and work well on my smartphone (Librem 5).
Not talking about what's possible or impossible, just not seeing it gaining any significant market traction.
Modern pentathlon is a sport where athletes ("pentathletes") compete across five different events. Even the best of them would be mediocre at best at any specific event competing against athletes who have trained for that specific event.
While I love Hanlon's razor as much as anyone, I love Occam's razor even more. And the simple explanation for that is that people don't like to use a mediocre combo when they could be using several purpose-built devices which excel at a specific job.
I like to go around with my Leatherman, but when I have a real job I get my toolbox out instead.
> it usually can do most tasks sufficiently well
An RV could do many "car tasks" sufficiently well, and it can do many "home tasks" sufficiently well, too. And while it has its place, it can't compete against the car + home combo, which is why most people have cars and homes but not RVs.
> > or insufficient PR
>
> Aka "stupidity".
>
> While I love Hanlon's razor as much as anyone
Stupidity isn't necessary here. A small company simply has no resources for a large PR campaign.
Also, nothing prevents a performant iPhone from allowing a full desktop mode except artificial software restrictions resulting from greed. Follow the money.
This is exactly what we're asking for. We don't want to get rid of our tablets and laptops. But we want Leatherman-level functionality from our phones.
That's perfectly fine. You can get it today from Samsung, or maybe soon from Google. My argument is that the Leatherman is never going to sell at anywhere near the level of the combination of purpose-made tools it incorporates. You're welcome to disagree.
I'm not so sure about unattainable. Many programs exist that serve the user in very different contexts, for example responsive websites and web apps. Or games that work on both a console-like device, such as the Steam Deck, and a PC. Or the Nintendo Switch, which can be used in handheld mode, with a small screen and battery, or docked and connected to a TV, controllers attached or unattached.
Now, I can see problems too: docked and portable modes need very different performance optimizations. But I'm sure that software can handle this; for example, IntelliJ IDEA has a power save mode, and OSes have also demonstrated that they are fine on portable and plugged-in systems alike, like macOS, Windows, and Linux.
It's also not a problem that some things are not available in both modes. For example, the Switch has games that explicitly need docked mode, such as Super Mario Party. Yet both the game and the platform are popular.
I see no reason why a phone couldn't be a mediocre, or better PC.
A phone could be a mediocre PC. In my opinion, it will not gain any significant market share competing against other PCs -- mediocre ones, good ones, great ones, and "insanely great" ones too.
I can imagine a scenario where it's really useful: when portability is concerned. This segment right now is served by web apps, which essentially give the same end result, supposing a working internet connection: the user can have the same software and the same files on multiple systems, like a phone and a PC. The device itself being portable would give the same result, but infinitely more private: all the apps and files could live locally. Now, I don't know how large that market is, and I suspect that it's not that large, given that people are just fine with cloud-based solutions.
In other words, I think the functionality itself is very useful. It's just that it's currently served somewhat adequately by cloud-based solutions. For this reason, such a phone-PC product could not offer much new in terms of functionality, and so it might not be popular at all.
I don't think we can fairly compare a phone pretending to be a desktop against other desktops.
It would be more fair to compare a phone that has desktop features, to a phone that doesn't have desktop features.
So let's compare the best Apple phone, which refuses to have a DeX-like experience, to a Samsung that has had a DeX experience for about 10 years, or to a Google phone that is now adopting desktop experiences.
If the future is anything like the past, in 5 to 10 years from now we'll see a desktop experience on iPhone and they're going to be snobby about it.
You don't have to degrade the phone experience at all. Just add a Linux VM that it switches to when I plug it into a monitor and has access to my files and we're good.
To me, something along these lines is by far the best approach. Under Android, a Linux desktop could be virtualized on top of Android’s Linux kernel and under iOS, a macOS userland could be virtualized on top of iOS’ Darwin underpinnings.
It's the only way you don't end up compromising either half of the experience too much. Trying to converge both into a single UI, as Microsoft previously did with Windows and as GNOME is now trying to do, is a recipe for failure.
Do you have any examples of desktop UI interfaces that are impossible to create in a DeX-like experience, but possible on Linux, Mac, and Windows desktops?
Or is it that mobile apps will never work great/ideally on desktop? And if that's the case, how is that worse than not having them at all?
My argument is not "this is impossible to create", my argument is "in my opinion this will not succeed in competition against purposefully-made devices." You're welcome to disagree, of course.
> There's no universal design that would fit just any living room.
Yes. This is why any furniture store that thinks it can just offer three designs that cover 90% of people is going to inevitably fail, as evidenced by Ikea. It's a ridiculous idea that the same three designs could be adapted to living rooms across one country, let alone the entire globe.
Oh wait. Half of the world's population uses the Bestå cabinets and Nordviken chairs.
There is tremendous value in git; however, you need to take into consideration the fact that 100% of the people and companies who pay GitHub (and thus generate value for Microsoft) have the option of just using git without GitHub, for $0.
This causes me to think that what these people are actually paying for is GitHub, and not git.
Which causes me to think that, when Microsoft valued GitHub at over $7.5 billion, that value reflected the value of GitHub sans git.
Linus is in the business of advancing humanity without making billions in the process. I can only salute him for that.