Hacker News | elwes5's comments

They sorta did, then sorta skipped everyone else. The real requirement was that their bootloader was in the first partition. You can install Windows to a different drive/folder, though it would probably mess with a lot of programs out there. The later versions (past 2000 I think, maybe XP, it's been a while) did not really give you the option to put it somewhere else.

https://docs.microsoft.com/en-us/windows-hardware/drivers/de...

The problem is they only really made it work correctly with OS/2, Windows NT, Win9x, and DOS. Even then it was kind of a pain to get working.
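From memory, a multi-boot boot.ini from that era looked something like this (the ARC paths and OS names here are illustrative, not from any real machine):

```ini
[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(1)\WINNT

[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINNT="Windows 2000 Professional" /fastdetect
multi(0)disk(1)rdisk(0)partition(1)\WINDOWS="Windows XP" /fastdetect
C:\="Microsoft Windows 98"
```

NTLDR handled the NT-family entries itself and chain-loaded DOS/Win9x through the `C:\` entry (a saved copy of the old boot sector), which is roughly why those were the only well-supported cases.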

Have not messed with the UEFI bits they added in recent versions, so it may be different now. But I doubt it.




Did not realize I had to do full-on research before posting on everything? My point, if you had read it, was that boot.ini allows multi-boot in particular cases. I am not sure about the UEFI bits, as I do not really multi-boot anymore. I am sure it is either great or miserable. I just personally have little need to do it anymore, as VMs tend to solve the particular issues I ran into. Did your browbeating me bring much to the discussion? Why did you bother to comment other than to mock me? Am I understanding YOU correctly?


I agree on the vertical space thing. I would add that the lack of contrast between controls is very annoying. That 'flat' look is a pain: can I or can I not click on this? The one that is a real pain is the disappearing scroll bar. It is there, the space for it is used, but it is just gone until you hover over it. Then it is clunky to use compared to all of the other Windows controls. The rules for making a GUI were dead simple for MS in the 90s. I had them printed out on one sheet of paper, and most of that was a couple of pictures.


THIS.

I still have to think when I look at the control center on my phone: uh... moon == Do Not Disturb, and is it on or not? The faux-3D look let you know. I also wish there was text under the controls.

I've resorted to turning on some of the contrast accessibility stuff to make it a little more usable.

And don't even get me started on the hamburger menu.


>lack of contrast between controls is very annoying. That 'flat' look is a pain. Can I or can I not click on this?

VS Code. How many seconds does it take to figure out which code tab is active?


I must use scrolling windows differently than you, because I haven't clicked and dragged a scroll bar probably for years. The only thing I need a scroll indicator for is to see how far I am down the page, which I only do when I'm actively scrolling. MacOS disappearing scrollbars (which do not waste space when inactive, but pop in over content when you scroll) are my ideal implementation.


I want to know the current position, and I still sometimes grab the scrollbar and drag, because it is sometimes more comfortable than scrolling a wheel.


My point is that it is terribly inconsistent. There are basically four ways now: the scroll bar is always visible; the scroll bar is borderline invisible, color-wise, against the background; it fades in and out over the text (which is annoying if you happen to be reading that bit when you activate it); or it fades in and out with the space kept but the control invisible. The fade-in/out variants have a second issue: can I scroll at all? They suffer from the 'below the fold' usability problem, as in: there was more stuff on the page but I missed it, all because a paragraph break worked out just right.


I often like to know how long a page is when I'm not scrolling too.


Your perspective is not just 'non-US'. The thing is, since the early 80s computers have been able to generate huge swaths of data. ML gives you a way to filter that data in a particular way. The same was true of data warehouses, smart systems, etc. The issue is not business people vs. technical people. It is an understanding of what you want it to do. A few years ago I had a system that could generate 2k data samples every few seconds (switches, voltages, temps, etc). What are you even looking for in that pile of stuff? You cannot just feed it into an ML network and hope for the best. You have to describe what you are looking for. I had this same conversation over and over when working on data warehouse projects. A good BA on a project like that is amazing; someone who is just kinda meh on it will kill the project dead. I do not understand your business, you do. I can apply what I have learned from other companies, but only to a point. After that point I basically have to become a BA in that company just to understand what to write.


I know the Verizon one does. It is just not on their pages. The platform they bought from Qualcomm to do it was in the same space as Twilio. It has been a few years though, so they may have ripped it out by this point. 10 cents per MB and 5 cents per SMS. Ouch!


Building fast applications on most platforms is usually possible, but you must play to those platforms' strengths. I used to mess with my fellow devs by making toy VB6 apps that ran very quickly (not my favorite environment to work in). It usually meant throwing out tons of junk. I could never pull that trick off with p-code VB applications; that p-code interpreter was just too slow. I would also do this with Win32/C++ programs: you can make surprisingly light applications if you know what your platform does. The mistake many people fall into is just tossing whatever in there and hoping for the best. That works at first when you are spinning up, but usually you end up having to do something about it.


Big O gets you in the right ballpark of what to look at. The constant factor 'C' that gets left out can still doom it to be worse than the alternatives, though.

For example, for very small sets of numbers you could basically pre-sort every combination there is, then use a very large lookup table, much like a rainbow table for passwords. Your lookup is basically a binary search, O(log n), or even O(1) if you can index the table directly. However, the upfront cost is huge, the storage cost is huge, and the lookup itself would probably be big too. Hmm, now that I think about it, this could be an interesting thing to mess with. Also, at this time anything past 4 bytes would be unusable. Hmm, maybe later. Like you point out, a 'galactic alg'. Portions of that thinking can be pulled out and used for other things though.
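A toy sketch of that idea, assuming a tiny fixed-size input (all names and the domain size here are made up for illustration):

```python
from itertools import product

# Precompute the sorted order of every possible 3-tuple over a tiny domain.
# For real integer ranges this table explodes combinatorially, which is
# exactly the huge upfront/storage cost trade-off described above.
DOMAIN = range(4)  # toy domain: values 0..3

LOOKUP = {t: tuple(sorted(t)) for t in product(DOMAIN, repeat=3)}

def presorted(t):
    """O(1) 'sort' by direct table lookup; all work was done up front."""
    return LOOKUP[t]
```

For example, `presorted((3, 0, 2))` returns `(0, 2, 3)` with a single dict lookup, at the cost of a table that already has 4^3 entries for this trivial case.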

With sorting, the comparison count is a good proxy for whether it might perform well. But it is just that, a proxy. Take the weird sort I just made up: there are probably a few shifts and muls in there. Those are decently expensive and can blow your perf right out of the water.


> Big O gets you in the right ballpark of what to look at

Generally I'd say that's true, but even that depends on context. For sorting very small arrays, on typical hardware, you can't beat bubble-sort and insertion-sort.


Oh absolutely. The bubble sort thing usually comes down to the architecture of the machine, one of the things O notation kind of hand-waves away. On paper some things are faster. But put in three levels of cache, a CPU scheduler, and a particular ASM instruction flow that makes things faster/slower, and suddenly things are different. That is my biggest gripe with the notation. It is good for getting you 'close', but sometimes you just need to fiddle with it and try it. The 'C' bit can get you. On paper bubble sort is always worse, but it can run faster for small sets because the code and the data are small enough to fit into L1, whereas with a mergesort implementation maybe the code or the data fits, but not both.
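This is roughly why many standard library sorts hand small subarrays to insertion sort below some cutoff. A sketch of that hybrid idea (not any particular library's implementation; the cutoff value is arbitrary):

```python
def insertion_sort(a):
    """O(n^2) comparisons, but a tight, allocation-free, cache-friendly
    loop -- often the fastest choice for very small arrays."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements right to make room for key.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

def hybrid_sort(a, cutoff=16):
    """Mergesort for big inputs, insertion sort below the cutoff."""
    if len(a) <= cutoff:
        return insertion_sort(a)
    mid = len(a) // 2
    left, right = hybrid_sort(a[:mid]), hybrid_sort(a[mid:])
    # Merge the two sorted halves.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]
```

The right cutoff is exactly the kind of thing you have to measure on the actual machine rather than read off the notation.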


This may get me some anger, but I always found Scrooge to be a very sympathetic character. He starts off as a hopeful boy who just wants to spend some time at Christmas with his family and is shipped off to work, as it is his time to do so. He gets dumped hard by his GF on Christmas, for she thinks he only thinks of money, when he just wants to do right by her. His only friend died on Christmas. So by the time we see him, he sees Christmas as a 'humbug', or trick. He is crazy rich yet spends none of it, not even on himself. He is always looking for someone playing a trick on him. So when his nephew shows up and says 'come to Christmas dinner', he treats it with suspicion and blows him off hard. It takes the ghosts to show him that while some people are terrible, most are just trying to get by, and that his 'terrible' Christmases were mostly by his own hand. Their message: share yourself with others, for you do not have much time left. He gets the point. He even plays a bit of a 'humbug' on his clerk the next day when he gives him a raise.


I think this is pretty much the standard interpretation of Scrooge; those who know him best - Bob Cratchit, Fred (Scrooge's nephew), and Belle (his former fiancée) - all feel pity for him.


Also, do one thing at a time. Do not try to do them all at the same time; that is a fast track to nothing being done. Put them in order, highest to lowest, and pick an 'easy' win. Make it easy for people to want to do the right thing. Once it is in place, retro it: is it working? Why not? CI/CD takes time to build. These devs probably have not done it before and have no interest in it, because they do not see any benefit. Also be sure you want this, because in a small group you will own it for a long time.


The amount of crazy things people did in Win3.x was pretty amazing. MS kind of did it to themselves. Want to know what an HWND looks like? Right there in the header. No functions to access those items. Now you cannot change that struct without breaking hundreds of programs. I remember the other devs grumbling that the Win9x headers did not have the system structs in there. What was just a pointer deref became a 'which method do I need to change this feature now...' game. That MS pulled that compat job off and did not absolutely obliterate everything, I have to admire.


One thing to keep in mind with embedded is size. On some projects you do not get to have the std lib; it just does not fit. This is less and less of an issue as time goes on, but a few years ago 2k of total memory (flash and RAM) was a real constraint you had to conform to. Even that could seem huge for some platforms.

Depending on the platform Rust or C++ may not be the first one I reached for.

C is pretty compact if you strip the libs out, but usually at that point you may have to go to ASM just to make it fit.

Had one project which had to be in Python (platform dictated by the customer). Did all the right style guide things: classes, the works. Had to toss all of that out, as just using classes subjected it to a ~300 byte overhead per object. I had hundreds of the things; half my memory was being used by object management, and I needed that space for data. Out went all the cool by-the-book things that seem right to do. Borderline a total re-write. I took those lessons to the C version. Right up until the new guy decided everything needed to be C++ and use the std lib, and would not listen to me. Suddenly it did not even fit in flash and RAM combined, off by about 5x, much less leave room for any data. He had to spend weeks backing the changes out again. Even after re-stripping it, the code size dictated we just could not use some platforms.
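The per-object overhead is easy to see even on desktop CPython (exact byte counts vary by version and platform, and embedded interpreters like MicroPython differ again, but the principle is the same; the class names here are made up):

```python
import sys

class Sample:
    """A 'by the book' class holding one reading."""
    def __init__(self, voltage, temp):
        self.voltage = voltage
        self.temp = temp

class SlimSample:
    """__slots__ drops the per-instance __dict__, cutting the overhead."""
    __slots__ = ("voltage", "temp")
    def __init__(self, voltage, temp):
        self.voltage = voltage
        self.temp = temp

s = Sample(3.3, 25)
slim = SlimSample(3.3, 25)

# A plain instance pays for the object header plus its attribute dict;
# a slotted instance stores the two fields inline.
print(sys.getsizeof(s) + sys.getsizeof(s.__dict__))
print(sys.getsizeof(slim))
print(sys.getsizeof((3.3, 25)))  # a bare tuple is leaner still
```

Multiply whatever delta you see by a few hundred live objects and it is easy to end up with half your memory going to object management, which is exactly what happened above.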

Another thing to keep in mind is compiler maturity. C++ in the past 10 years has come a long way, but on these embedded platforms you may be working with a C++ toolchain from the late 1990s (if you are lucky). Some of the toolchains shipped with these chips are in poor shape, and they will never change unless you spend a lot of time bringing them up to something semi-current. Getting a Rust toolchain on it? Maybe, if you spent months messing around with it. Months you could spend shipping product.


C++ can be written in a compact way; in fact some of its features are useful in that regard. But you will probably want to turn off exceptions and RTTI (two features which are unfortunately not 'zero-cost').


Like you say, it can be done in a compact way. There are some gotchas with memory though, like the RTTI and exceptions you point out. Another one people forget about is the overhead of the objects themselves. That is not zero-cost on many compilers: there is usually a hidden vtable pointer (or something like one) per object, so you can do those cool C++ things like inheritance and virtual dispatch. Most of the time when people say 'C++' I have found they do not mean the language, which is decently compact; they mean the C++ std lib. It is a distinction many do not make, unfortunately.

It is becoming less and less of an issue as the more powerful SoCs have come down in price and gained more memory/flash. The newer ones also usually come with a somewhat modern compiler stack. Also, if you follow some of the MISRA standards, on some projects that can radically change what you can and cannot do.


Rust avoids a lot of the C++ pitfalls out of the gate with a split core vs std library set.

Panic handling and message formatting are, however, a known size drawback in debug builds. In release builds this gets optimized away in most cases.

