Operating Systems: From 0 to 1 (tuhdo.github.io)
709 points by tuhdo on Feb 14, 2017 | 79 comments



tuhdo, can you say more about how the process works to convert your book from LyX text source to PDF? Since the current PDF seems to have particular formatting added to the source text (e.g., terms not emphasized in remaining.txt are italicized in the PDF), it's a little tricky to tell. Is the raw LyX file where the formatting rules live? Does the LyX file store its text source in text files, but its formatting rules in binary?


From the book:

"If a programmer writes software for a living, he should better be specialized in one or two problem domains outside of software if he does not want his job taken by domain experts who learn programming in their spare time."

Seems a bizarre sentiment, but after reading this sentence, I feel like I really wanna donate some money to the guy. If he provides a way, I surely will.


The replies to your comment took a turn toward whether we will or won't need programmers in the future, so let me take it in a different direction.

I think what he's explaining is that for most (maybe all? no empirical evidence to back this up), programming and computer software in general are a means to solving a problem.

Operating systems solve the problem of needing a foundation on which to develop higher-level tools that enable computing. Whether the higher-level problems are complex astrophysics calculations or hailing a ride, as you travel down the stack, each piece of software is solving a problem in some domain.

The point of this sentence is that, as you move forward in your career, you (usually?) develop knowledge of an industry, a customer base, a constituency, or some other problem domain in which you combine your skill in engineering with your knowledge of the domain.

If you don't do this, and you say "I can do anything but I have no special knowledge of any problem" then it's more likely a person with both problem-specific knowledge and engineering knowledge will be given the job.

However, there are so many problem domains in the world of computing, it's very easy to pick one or two that interest you personally.


Programming is an analytic process: gathering requirements, understanding or creating a process, and translating that into the language that computers understand. Only the last bit is likely to be made simpler for non-programmers in the near future. So "programming" is a skill on its own, rather like "management".


Can you explain more about what inspired you in this sentence, or do you still think it's bizarre? And if you do still think it's bizarre, why would that convince you to donate?


Not OP, but one thing we as developers should recognize is that programming itself is likely to become merely a skill like operating a spreadsheet or typing. In much the same way, writing was once a skill available to (or at least practiced by) a very small subset of people, and the spread of literacy made the job of "scribe" largely irrelevant over time. Children today are learning how to program—perhaps not yet in the numbers we might have imagined, but a growing number. As they grow up, they are not going to pursue jobs as "programmers" or "developers", any more than I would pursue a job in a typing pool. They will merely incorporate this skill into whatever role they end up in.

Those best suited to compete in that scenario are going to have deep experience in some area where programming is an applied skill, not an end unto itself.


While I agree to some extent, and have even explored the idea of how my programming skills could be augmented with other non-programming jobs, I still feel that for the foreseeable future we will have programmers.

I see lots of jobs being done by non-programmers that would be done much, much better had that person been a programmer. The future will undoubtedly be full of people in traditionally non-programming jobs with programming skills. But we should not trivialize the systems that allow this to happen. While most programmers work very high in the stack, there are many very deep layers sitting below that will require an army of programmers to keep going and keep innovating so more productivity can be had at the higher layers.

As an example, I feel people often trivialize one- or two-liner programs, or short examples that seem to do so much in so few terms. But even those simple examples are sometimes backed by tens of thousands of lines of code. A wave of a hand and a barked spoken command to a computer will -- at least with today's software and hardware -- never end up producing something new and innovative that will be widely used.

To get where you are suggesting, many things have to change. And it's not all software. If you even take a cursory glance at what it takes today to bootstrap an operating system, you will see how complicated things get. For the future you are envisioning, the hardware will have to evolve along with the software to make things much, much more modular than they are. More standardised terms to describe things and more finite building blocks will need to be made. In fact, the closest thing I can get to your future vision with today's technology is FPGAs, and it is no trivial task to do anything useful with a hardware description language.

tl;dr We have not solved computers or software; therefore, an army of programmers will be required for the foreseeable future.


I don't think that day is as far off as you seem to. Yes, there are some kinds of programming which require deep expertise in programming itself, but these jobs are far from the majority.

If you've ever read Vinge's _Deepness in the Sky_, recall that the protagonist's occupation for a period of time was that of a code archeologist, attempting to collect, categorize, and understand code that was centuries (or was it millennia?) old... We may be a ways off from that, but I don't think it's far from the truth of the future.


Well, typing is more of a motor skill than programming is. Almost everyone can learn to drive a car (manual gearbox), but the same could not be said for programming, where problem solving, logic, and intelligence are a must.

But with that being said, people from other professions could still learn how to program and do it well.

Maybe you think of a programmer as a person who just types in a program designed by someone else, a developer, in a specific language with a specific syntax?


I think a programmer is someone who writes programs, regardless of the quality of the code. Substitute spreadsheets for typing if you think it's more appropriate. Not everyone with domain expertise knows how to use a spreadsheet, and not everyone who uses spreadsheets uses them as well as someone who, for example, writes software for a living (and not all of the latter use them as well as someone who only uses spreadsheets!). But good luck finding someone whose only job is getting hired to code up a spreadsheet. The vast majority of uses are to allow someone with just enough skill to create something that performs a job function in an automated way, which in turn makes another job that much easier. Once upon a time, it would have been a person's job to perform the calculations and updates that a spreadsheet makes trivial.

It would be foolish to think that the same will never happen to the vast majority of our jobs.


I see that it could be true for many desktop and mobile phone applications but who should develop the operating systems, embedded applications and backends? It is a full time job just to learn all the programming languages, libraries, frameworks, standards, protocols and other related things.


Well, the "programmers" in operating system and embedded domains are usually electrical engineers who learned how to write code, just enough to apply their domain knowledge. A programmer in the usual sense, that is someone graduated from pure software school, will have a hard time competing with those people.


We still have a lot of different kinds of scribes.

The number of people who can put together effective, performant and reasonably correct systems is not that large.


Interesting... What will they do in the future? Because by then bank jobs would be gone, basic health care would be performed by robots, and even writing done by machines. Hell, everyone would be at home.


I was misunderstood. What I find bizarre is my feeling of wanting to give money because that phrase resonated with me so much. The sentence he wrote is not bizarre at all. It is a gem.


Thank you. I will make a donation button once the book is relatively "stable".

About the sentence: sadly, it is the trend.


I am sorry. English is not my mother tongue, so it came off wrong. I didn't think the sentence was bizarre. It just resonated with me so much, maybe because currently I am involved as a programmer in a project where, if I had more domain knowledge, my life would be so much better and I would be able to take initiative a lot more. And I just got a feeling of a need to support your work. I called it bizarre because at the moment it seemed too sudden. Looking forward to the complete text. Nice work :)


I think the entire section 1.1 can be removed as it seems a bit pompous. Other than that, it seems like a reasonable guide.


I can see your point, but the section created a nice "atmosphere" for me and it was not too long. I don't know how to express it clearly.


This looks great!

I would also like to recommend another free resource that might be a good complement(theory vs implementation) to this:

"Operating Systems: Three Easy Pieces"

available online at:

http://pages.cs.wisc.edu/~remzi/OSTEP/


I haven't yet finished this book, but it is a very good book imo. Can recommend.


Looks very nice so far. One minor nitpick - there should definitely be a chapter on inter-process communication, it's an important part of operating systems.


Every time an OS book is posted on HN (about weekly), I immediately want to jump all over it, but I gently remind myself to finish CMU's Computer Systems: A Programmer's Perspective and The Elements of Computing Systems (nand2tetris).


It seems you've already decided that these are worth reading, but in case you want some motivation, I had CS:APP as my textbook last term and I thought it was quite nice. I also worked through TECS several years ago and it was also a great resource. CS:APP is quite dry but by the end of the course I was able to write a real memory manager in C, which was super fun! If you're ever looking for an internet book reading buddy, I also need motivation. I worked through TECS with a friend over a few weeks and found it very useful to have someone else seeing and working through the same problems.

Good luck!


TECS? I would like to have an online buddy to do these self-learning courses. Most of my projects fizzled due to lack of accountability.


TECS is The Elements of Computing Systems. Feel free to send me an email!


For those of you that have read, or at least skimmed, this, how does it compare to the Minix book? (which was a joy to read!)


How much time would it approximately take to work through this?


Seems to be for an x86 operating system. I'd have preferred some other architecture, because so much of OSdev for x86 that I remember was working around quirks of the architecture (the A20 gate etc.). I guess it's a valuable lesson, but I'd really enjoy a fork of the book for the hardware you find in a BeagleBone or Pi 3 or something. Maybe this could be crowdfunded if the x86 version is popular?


Author here. Yes, it's an x86 operating system. However, rather than getting around A20, it focuses on protected mode instead.

The book not only teaches x86, but how to use the official resources from the hardware manufacturer to write the OS. In sum, when a reader reaches part 3 to write the OS, he will need to use the official documents, in this case the "System Programming Guide" manual from Intel, to write C code that complies with them. Once he has learned how to do so, learning other platforms will be much easier, given how complex x86 is.


> Author here. Yes, it's an x86 operating system. However, rather than getting around A20, it focuses on protected mode instead.

You still have to open the A20 gate in the bootloader if you want to access a memory address that has bit 20 (counting from 0) set to 1 (which you probably do), even if you switch to protected mode. The only exception is if you boot from UEFI instead of BIOS; in that case the A20 gate is already open. But the book uses BIOS as far as I can see.
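For reference, one common way to open it is the "fast A20" method via System Control Port A (port 0x92). This is only a rough sketch in freestanding C, assuming GCC-style inline asm and the inb/outb helpers shown here; real bootloaders often do this in assembly and fall back to the keyboard-controller method:

    /* Hedged sketch: "fast A20" via port 0x92 (System Control Port A).
       Assumes a freestanding C environment with GCC-style inline asm. */
    static inline unsigned char inb(unsigned short port) {
        unsigned char value;
        __asm__ __volatile__("inb %1, %0" : "=a"(value) : "Nd"(port));
        return value;
    }

    static inline void outb(unsigned short port, unsigned char value) {
        __asm__ __volatile__("outb %0, %1" : : "a"(value), "Nd"(port));
    }

    void enable_a20_fast(void) {
        unsigned char v = inb(0x92);
        if (v & 0x02)
            return;        /* A20 already enabled */
        v |= 0x02;         /* bit 1: enable A20 */
        v &= ~0x01;        /* never write bit 0 (fast reset) */
        outb(0x92, v);
    }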


In Volume 3, it says:

"In protected mode, the IA-32 architecture provides a normal physical address space of 4 GBytes (2 32 bytes). This is the address space that the processor can address on its address bus. This address space is flat (unsegmented), with addresses ranging continuously from 0 to FFFFFFFFH. This physical address space can be mapped to read- write memory, read-only memory, and memory mapped I/O. The memory mapping facilities described in this chapter can be used to divide this physical memory up into segments and/or pages."

It correlates with my experience of developing in protected mode in QEMU. Once in protected mode, I can access any address above 0x100000 without it being wrapped around. When I was writing my first kernel (https://github.com/tuhdo/os-study) in real mode, A20 indeed had to be enabled.
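For what it's worth, the usual way to verify this empirically is the classic aliasing test. Here is a rough, illustrative sketch, assuming a flat, identity-mapped protected-mode environment where physical addresses can be dereferenced directly; the scratch addresses are arbitrary and exactly 1 MiB apart:

    /* Hedged sketch of the classic A20 check: if 0x100500 aliases 0x000500,
       the gate is still closed. Assumes identity-mapped, writable memory. */
    int a20_is_enabled(void) {
        volatile unsigned char *low  = (volatile unsigned char *)0x000500;
        volatile unsigned char *high = (volatile unsigned char *)0x100500;
        unsigned char saved_low = *low, saved_high = *high;

        *low  = 0x55;
        *high = 0xAA;
        int enabled = (*low != *high);  /* if they alias, A20 is off */

        *low  = saved_low;
        *high = saved_high;
        return enabled;
    }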


The A20 gate affects both real and protected mode, so to be truly compliant, you should be opening it even for a protected-mode OS. It was originally a purely hardware hack forcing an address line (A20, the 21st address line) to stay low, thus "wrapping" addresses from 1MB-2MB down to 0-1MB, so it would have affected real or protected mode identically. On modern hardware, there is a chance that A20 is (against the accepted spec) open by default, or possibly not implemented at all.
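In other words, masking the line simply forces bit 20 of every physical address to zero. A tiny illustration, for intuition only:

    /* Illustration only: with A20 masked, bit 20 is forced to zero, so
       0x100000 aliases 0x000000 and 0x1FFFFF aliases 0x0FFFFF. */
    unsigned int a20_masked(unsigned int addr) {
        return addr & ~(1u << 20);
    }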


> On modern hardware, there is a chance that A20 is (against the accepted spec) open by default, or possibly not implemented at all.

To nitpick on this a little bit (consider it as a polite supplement):

In Intel® 64 and IA-32 Architectures Software Developer’s Manual Volume 3 (3A, 3B, 3C & 3D): System Programming Guide

in section

8.7.13.4 External Signal Compatibility

one can read (emphasis by me):

"A20M# pin — On an IA-32 processor, the A20M# pin is typically provided for compatibility with the Intel 286 processor. Asserting this pin causes bit 20 of the physical address to be masked (forced to zero) for all external bus memory accesses. Processors supporting Intel Hyper-Threading Technology provide one A20M# pin, which affects the operation of both logical processors within the physical processor.

The functionality of A20M# is used primarily by older operating systems and not used by modern operating systems. On newer Intel 64 processors, A20M# may be absent.".

TLDR: The accepted spec is that A20M# might not exist.


Fair point, I hadn't bothered to look further, but it's still true that A20 (when implemented) gates both real and protected modes, and therefore "ignoring it to focus on protected mode" is invalid.


It's a from-first-principles guide to building an OS.

From the title, I had mistakenly assumed it was about the first OS ever.


Interesting. I was sure from the title that it's about building an OS, but it's fascinating how different people interpret things.


Like almost everything in computer science, there is no clear definition of the first OS ever. There used to be monitor programs and schedulers before they iteratively evolved into OS programs. Just like programming languages, servers, 'cloud'... There is seldom a first-ever anything in our field.


I'm curious what "monitor programs" means in this context.

The first OS was OS/360 at IBM in 1964.


OS/360 was not the first OS. Quite a number of operating systems came before it.

Examples of earlier operating systems include the SHARE Operating System (SOS), first version in 1959, for the IBM 709 mainframe, which in turn was based on an earlier operating system, GM-NAA I/O, first version in 1956.

A very notable pre-OS/360 operating system was MIT's CTSS operating system, the first timesharing OS, first version in 1961 for the IBM 7094. The first versions of OS/360, by contrast, notably lacked timesharing – that was initially provided by alternative IBM operating systems (TSS/360 was the official answer and CP/CMS, which later became VM/CMS, the principal unofficial one); OS/360 only gained timesharing support itself when TSO was released in 1971.



I have it on my list to read something along these lines, but this seems incomplete. Does anyone have anything similar?


It is complete if you finish the first 2 parts, which consist of 8 chapters. Then you can work on your own by reading the Intel manual Volume 3, "System Programming Guide", or learning from the OSDev wiki. The first two parts provide a foundation for using such resources, which I think is more important than a step-by-step guide on how to write an OS.


The Little Book About OS Development by Erik Helin and Adam Renberg: http://littleosbook.github.io/#introduction


This is what I'm planning to read as soon as I've learned assembly: http://www.brokenthorn.com/Resources/OSDevIndex.html


I graduated from that guide :) You can see the repo here, written entirely in NASM: https://github.com/tuhdo/os-study

The problem is that the guide is out of date in terms of toolchain, and you need to figure out many things by yourself, especially if you want to develop on Linux. My book helps you understand how to learn and write x86 with the Intel manuals (this is really important!), and how to craft a custom ELF binary that is debuggable on bare metal, which in turn requires you to understand a bit of how a debugger works.

Once you get gdb working, it is much easier to learn how to write an operating system.


Oh, good to know! I'll keep that in mind and keep a bookmark of your book and your implementation. Actually, I wanted to start writing an OS by following the BrokenThorn tutorial and was quite naive. After reading some pages it became clear that I don't know much assembly, so I started learning from Jeff Duntemann's assembly book [1]. As far as I can see, your book also teaches the basics of assembly, and it seems more friendly to beginners. Maybe it will be a better start for me. Thanks for putting in the hard work!

[1] http://www.duntemann.com/assembly.html


It not only seems incomplete, it says it is incomplete, which is not unusual for something that is a work in progress.


Maybe not exactly what you're looking for, but here's a Linux-specific ebook similar to this [1].

[1]:https://0xax.gitbooks.io/linux-insides/content/


Well, the book is not Linux-specific but only leverages Linux as a development environment.


I feel like it focuses on a lot of Linux-specific details that an OS-agnostic book merely using Linux as a 'development environment' definitely wouldn't.

There are many topics that other OSes, like Windows and Solaris, do differently, and talking about them even a little bit would be beneficial, but I haven't seen any trace of that.

Searching for Windows or Solaris shows nothing, and searching for Unix shows a single page about Unix signals.


Others are saying it builds up an OS from first principles, so Nand to Tetris might be comparable:

http://www.nand2tetris.org/


For a comprehensive book on the theory behind operating systems I can recommend "Modern Operating Systems" by Andrew S. Tanenbaum.

It does not focus on the concrete implementation of an OS though.


Operating Systems: Design and Implementation by Tanenbaum and Woodhull covers the implementation of Minix.


Well this has got to be one of the more ambitious things I've seen on HN. I wonder if the guy behind SkyOS is still doing operating systems. http://www.skyos.org/


> I wonder if the guy behind SkyOS is still doing operating systems. http://www.skyos.org/

No, he (Robert Szeleney, http://www.skyos.org/?q=node/411) doesn't. He (together with Heiko Hufnagl) founded a studio for creating software and games for mobile devices:

> https://en.wikipedia.org/wiki/Djinnworks

> http://djinnworks.at/about-new/


I will read your book... and thanks for such a book.


The hallmark of a good book is that it should leave you wanting more. That's what this book is all about.


Have you read the book?


On pg 38: Field Gate Programmable Array (FPGA)

Surely: Field Programmable Gate Array (FPGA) ?


Thanks. I will fix it.


The grammar makes that book fairly painful to read :-|


Hi, I would love to hear your feedback. Is it the style or the correctness? I would really appreciate it if you could show me a few of the incorrect usages of grammar.


I can tell you're not native. Have the book proofread by someone.


Yes, I'm not native. For that reason I tried to use simple grammar and sentences, and went paragraph by paragraph through a grammar checker until the end of chapter 4. I will try to improve it gradually.


It is better to proofread for correctness. I myself wanted to point out a few grammatical issues and was looking for the source on GitHub. You can simply create a repo, or add the source text to the current repo, so that people can easily fix these issues for you by submitting pull requests.

You don't want people judging this excellent book based on a few language quirks here and there.


Thanks. I will upload the source soon. As it is written in LyX, I'm not sure everyone will find it comfortable, though. You can always open an issue in the meantime.


Great. I just created an issue. You might want to create a few guidelines in the repo for creating patches regarding typo/grammatical issues, so that you can focus on the actual content of the book. Nice job with the book.


Just curious why you chose Lyx and not Org, for example. I still have your "emacs mini manual" open in my browser all the time, so I'm sure you feel natural in org-mode too.


Well, Org is suitable for small notes. But for the scope of this book, it's difficult to make Org work for such a complex layout (largely because I don't want to mess with the CSS).

I used LyX because it enabled me to focus on the content without all the markup text. Writing LaTeX in Emacs can reduce the distraction, but not enough. I just wanted to focus on the content at the time. Learning LaTeX is difficult enough; learning how to use the major mode at the same time doubles the difficulty.

Obviously, I still use Emacs daily for writing code and other things. Just not for writing the book.


Please do! I would also love to help you correct grammar issues via PR.


Thank you! I already put the book source online. You can use Lyx to edit the book source file.


A regular spellchecker can correct mistyped words, but not, for example, words miused in a particular context, a missing the/a, etc. Try some of the more advanced contextual spellcheckers and proofreaders like Grammarly or LanguageTool.


"miused"

https://en.wikipedia.org/wiki/Muphry's_law

If this was intentional - well played. If not - thanks for the unintentional demonstration of Muphry's Law :)

[ten seconds later] Now scanning my own post really carefully for misspellings and grammar errors ;)


Why not hire someone on Fiverr to proofread this? Not sure if that is too expensive. Also, have a donate button for people to donate. Then you can use that money to improve the book. GL!


Nice service! I didn't know it existed. Thanks.


Spelling error: MOSFET is misspelled as MOFET the first time.


Definitely. The content seems to be there, but this would really benefit from some copy-editing.

From just skimming through a few chapters I can completely understand what the author is saying but the grammar is jarring and pulls me out of the book.

Hopefully the author uploads the LaTeX source files to the repo soon. Grammar fixes are pretty trivial.


Nice job man!




