A look back: The Bloomberg Keyboard (2017) (bloomberg.com)
112 points by bookofjoe on Aug 19, 2019 | 55 comments



"The Grid" is a fascinating machine. If you're interested, its full name is the GRiD Compass 1101. It used a display I've never seen on any other laptop: an electroluminescent monochrome one. Really weird tech. EL is primarily used in static displays like car dashboards because it's very unlikely to "burn out" (I'm not even sure that's possible). I've never seen it driving a dynamic display before.

More fun facts, it was used in space, and perhaps even as the "nuclear football".


Once upon a time I had a 486 laptop with such a display. I kept it long beyond its prime because it was so superior to any other laptop display on the market.

Amusingly, the laptop featured a socketed 25 MHz 486 SX. I was able to swap it out for a 486 DX2, which added an FPU and an on-package clock doubler giving me a 50 MHz processor.


Considering monochrome wasn't really a limitation at the time, it makes quite a bit of sense as a display. It was certainly thinner than any CRT and so much lighter. I don't recall using one myself; how was the refresh rate, if you remember?


Its refresh was better than the LCD laptop panels of the time, which had horrible ghosting. Remember that Windows had an option to give the mouse cursor a trail, because without it you'd have a hopeless time finding a moving pointer.

It did suffer badly from burn-in, though. Remnants of the WordPerfect status bar were permanently visible at the bottom of the screen.


It looks like the PLATO system plasma displays that were very old when I was in undergrad in the late 80s.

https://en.wikipedia.org/wiki/Plasma_display


Yeah, it reminded me of the gas-plasma monochrome displays that were common on "luggable" PCs in the late '80s (like this one: https://jshorney.incolor.com/p70.htm).

They were much lighter than a comparable size CRT, which was the display solution used on the original luggables from companies like Osborne and Compaq. But that was about the only good thing that could be said about them.


Saw one of the Grid machines at VCF West a few weeks ago. The screen was eye-catching; a vivid red with no perceptible flicker. Looked great (though monochrome) even by today’s standards.


It works on the same principle of passing a current through a substance to make it emit light; it's just that electroluminescent panels use a solid material, so I think they can be made flatter.


I had one of these I got from someone who knew someone who knew someone in the Reagan administration.

If you pointed the screen at a TV set, the signal would go all screwy. Turn it the other way, and it was fine. That thing must have been shielded out the wazoo.


I keep wondering if Bloomberg has green and orange text in modern interfaces as a nod to tube-display terminals.


It’s not a nod, so much as a consistent interface over the years. Large portions of a lot of screens look mostly the same as they did 30 years ago.


The display is plasma, not EL. The same type was used on many '80s-era laptops and luggables like the Toshiba 3100. They can suffer burn-in from static images just like CRTs and modern plasma screens.


http://oldcomputers.net/grid1101.html

https://en.wikipedia.org/wiki/Grid_Compass

https://history-computer.com/ModernComputer/Personal/Grid.ht...

Everywhere says it's electroluminescent. I mentioned "burning out" instead of burning in; I know it's not the most technical term, but I mean the panel failing outright, the way LEDs fail in displays. In your car's speedometer, either the whole display fails (probably a broken circuit) or the whole thing works, because it's one EL panel behind a translucent screen.

Also, because an ELD's components are static (they radiate wherever a charge is applied, in a way very different from the phosphors in CRT and plasma screens), I don't think it's possible for them to burn in.


Yes, I had one of those Toshiba 3100s. They were more properly "transportables" or "luggables", as they had no battery.

And notwithstanding that, they still weighed some 5 or 6 kg (but they were tough and had a "proper" keyboard).

It is not clear whether the Grid Compass was EL or plasma. Here: http://www.old-computers.com/museum/computer.asp?c=900 they talk of amber plasma, and the photo here:

https://history-computer.com/ModernComputer/Personal/Grid.ht...

https://history-computer.com/ModernComputer/Personal/images/...

looks a lot like amber plasma (like the Toshiba 3100 and other "portables" of around that time).


It also appeared in the movie Aliens as the portable terminal used to control the sentry guns.

http://www.starringthecomputer.com/appearance.html?f=728&c=2...


I worked at a firm whose stated goal was for everyone on the Street to have the Terminal on one monitor and our product on the other. Our application was sophisticated and advanced, but it was a web app. The Bloomberg didn't really do what our app did, but it felt so much better: responsive, native, dense, and full of keybindings. Using the Terminal makes you feel powerful and in control; it is an extension of yourself.

Our app on the other hand looked and felt like all the other “modern” web apps: low information density, slow, no keybindings, and frustrating. Also it logged you out every 30 minutes! It was a total joke and we had a hard time getting people to use the product.

I think companies make the mistake of focusing just on features, especially with web apps. Performance is absolutely critical and the #1 prerequisite for a good product. But despite research and studies showing how much users care about performance, developers don't seem to. Everyone repeats mantras like "programmer time is more important", "... the root of all evil", "just wait a year or two and your app will be 2x faster!", etc.

With the end of exponentially increasing single threaded performance, I think developers are recently starting to realize how valuable performance is. Hopefully this trend continues and we software engineers start making software we are actually proud of.

The current state of development and developers is inexcusable and sad. The slow crap we make is honestly sickening.


It has fraction keys for 1/2, 1/4, 1/8, 1/32, and 1/64. But not 1/16.

There must be an interesting reason why; does anybody know?


Even to this day US Treasury bonds are priced like that. It's starting to get silly now with 'quarters of a 32nd of a dollar' and such (i.e. 1/128, but written as 1/4 of 1/32 due to the limits of human comprehension).


Specifically, the limits of trader comprehension, which are somewhat more restrictive.


Okay there, Super Fractions Fella.


There’s a 1/16 on there.

I assume it is a historical oddity. NYSE prices were done in fractions up until a few years ago.


Bond futures still are [1]:

*Minimum Price Fluctuation: One thirty-second (1/32) of one point ($31.25), except for intermonth spreads, where the minimum price fluctuation shall be one-quarter of one thirty-second of one point ($7.8125 per contract).*

As are some bond-like derivatives, e.g. MAC swap futures, which are quoted in quarters of thirty-seconds [2]. The way prices are written out is maddening; consider yesterday's settle [3]:

107'247

This means 107 and 24.75 thirty-seconds!

[1] https://www.cmegroup.com/trading/interest-rates/us-treasury/...

[2] https://www.cmegroup.com/trading/interest-rates/swap-futures...

[3] https://www.cmegroup.com/trading/interest-rates/swap-futures...
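For the curious, that quoting convention is mechanical enough to decode in a few lines. A minimal Python sketch, assuming the CME convention that the optional trailing digit 0/2/5/7 encodes quarters of a thirty-second (the function name is my own):

```python
# Trailing digit encodes quarters of a thirty-second: 0, 2, 5 and 7
# stand for +0, +0.25, +0.5 and +0.75 thirty-seconds respectively.
QUARTER_CODES = {"0": 0.0, "2": 0.25, "5": 0.5, "7": 0.75}

def parse_quarter_32nds(quote: str) -> float:
    """Decode a quote like "107'247" into a decimal price."""
    handle, _, frac = quote.partition("'")
    ticks = int(frac[:2])                # whole thirty-seconds
    quarter = QUARTER_CODES[frac[2]] if len(frac) > 2 else 0.0
    return int(handle) + (ticks + quarter) / 32

parse_quarter_32nds("107'247")  # 107 + 24.75/32 = 107.7734375
```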


It has its origins in the use of Spanish dollars and various related bits of gold that could be reliably cut in half, in early American history.

I think a lot of companies got away with using binary floating-point maths without introducing errors, for assets priced in those power-of-two fractions.
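That claim is easy to check: any fraction whose denominator is a power of two has a finite binary expansion, so an IEEE 754 double represents it exactly. A quick Python demonstration:

```python
from fractions import Fraction

# A Treasury-style price in quarters of thirty-seconds (i.e. 128ths)
# round-trips through a binary double with zero error...
price = 107 + 24.75 / 32
assert Fraction(price) == Fraction(107 * 128 + 99, 128)  # 99/128 == 24.75/32

# ...while an innocent-looking decimal tick like 1/10 does not.
assert Fraction(0.1) != Fraction(1, 10)
```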


I'd love to know what technologies the Bloomberg Terminal, and the host side, were originally implemented with, back when they started in the 1980s. I've searched for information on this topic, but haven't had much luck finding answers.


They used https://en.wikipedia.org/wiki/Interdata_7/32_and_8/32 and the Interdata OS/32 operating system. They wrote all the code in Fortran. There's a clue about the rationale in the Bloomberg by Bloomberg book: from memory, it's that they didn't want the (now defunct) investment bank they'd come from to think they'd copied anything, so they used a different OS and language. I think they did some pretty interesting things with their own networking technology to move all the market data around back then, but I know nothing about any of that.

One interesting thing about the Interdata systems is that they were the first non-PDPs to run Unix, via a guerrilla porting project done at a university in Australia. Bloomberg used the native OS/32 though, and somewhere there is a paper from that Australian university that gives a scathing review of it: some kind of timesharing, but with a single console where all output would appear. That made me laugh, because the "console room" was Bloomberg lingo for a kind of sysadmin group.

Later Bloomberg used a lot of different commercial Unix variants; at least HP-UX, AIX, Solaris and RHEL have been used, along with a lot of C++, JavaScript and other languages. There is still plenty of older code from the Interdata OS/32 days in service though, and a lot of unusual system control scripts and commands from that defunct system. I guess (?) it's probably nearly all Linux by now though!


I was tasked with calculating OAS durations on callable agency bonds with the Bloomberg API in Perl in the '90s. Basically I'd take the price and use the API to get the OAS, then bump the OAS up and down a basis point, calculate the price up and the price down, and take the difference.

The problem was that sporadically I'd get a price back that was out of line. It would match to the 5th or 6th decimal, but then would diverge. I could make the same call 10 times in a row and the price would be identical 8 times, but the other 2 it would differ in the 5th or 6th decimal. Since I was taking the difference of two prices that were otherwise relatively close, this would wreak havoc with my results.
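The procedure described above is a standard central difference. A rough sketch in Python (the pricing function here is a toy stand-in for the Bloomberg call, not its real API):

```python
import math

def oas_duration(price_fn, oas, bump=0.0001):
    """Effective duration from an OAS bump of +/- 1bp (central difference)."""
    p0 = price_fn(oas)
    p_up = price_fn(oas + bump)    # reprice with OAS bumped up
    p_down = price_fn(oas - bump)  # reprice with OAS bumped down
    return (p_down - p_up) / (2 * p0 * bump)

# Sanity check with a toy continuously compounded 5-year zero,
# whose analytic duration is exactly 5:
zero = lambda y: 100 * math.exp(-5 * y)
oas_duration(zero, 0.03)  # ≈ 5.0
```

With a 1bp bump, p_up and p_down differ by only a fraction of a point, so nondeterministic noise in the fifth or sixth decimal of either price lands directly in that small numerator, which is exactly the instability described.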

Bloomberg support was always quick with a response, but often the first level support had no idea what they were talking about. For two weeks they kept insisting that because the yield curve I was pricing with was live, what I was seeing was because of market moves. Finally I got to someone who was on the implementation team for that bit of code, and he explained that they had both Data General and Perkin Elmer (Interdata) machines supporting that function, and the answers depended on which architecture handled the response.


When I worked there (2003-2007), first level support was where new hires learned the ropes before moving on to their actual assignments.

It was an amazing way to keep employees well in touch with customers' needs, and at least at the time the program seemed to be well-enough run to deliver great customer support. Sorry to hear your experience was different.


> somewhere there is a paper from that Australian university that gives a scathing review of it

You are probably thinking of this: http://bitsavers.informatik.uni-stuttgart.de/bits/Interdata/...


That's the one, a great story!


The Interdata (PE) OS/32 manual is here, for those interested: http://www.dvq.com/oldcomp/interdata/interdata-os_32_mt-pgm-...


The pre-PC ones were basically SBCs running an OS provided by a certain well known CPU vendor. Original code on that side was PL/M before the whole thing went to Windows / C++.

Host side was mentioned in a sibling. It was all serial modem muxes back then, 300 baud up to 4800 baud backhaul.

Obviously it looks nothing like that today :)

Edit: from the man himself https://m.youtube.com/watch?feature=youtu.be&t=10m8s&v=WsRpY...


> The pre-PC ones were basically SBCs running an OS provided by a certain well known CPU vendor. Original code on that side was PL/M

I am guessing that you are talking about either Intel's ISIS or RMX operating systems?


Yes, RMX. Using a 2020 lens (the pre-PC '80s were the Wild West; not diminishing any engineering effort there), the systems were straightforward. When PCs came about and everyone else was making mostly desktop software, all the real value (apps & data) remained, from the very beginning, "in the cloud". It would have been hard to predict that this model would be exactly what everyone was clamoring for 30 years later.


C and FORTRAN.


The last one is the most uninspired & least interesting of the bunch.


The article works best if you read it as a tale of a company being dragged, completely unwillingly, into working the same way as the rest of the world does. It starts off with all these funky ideas, and slowly devolves into a bog-standard keyboard with a few extra keys stuck on.


It still has some interesting features though!

It includes a slot for a Biometric auth unit (B-Unit).


When I got the last one, I actually laughed.

It's an MBP keyboard


But it works better if you use it for anything other than the terminal. Programming on the previous one wasn't a great experience; the current one is a pretty decent keyboard (my opinion, at least).


Any idea what the switch mechanisms look like? I'd like to dream that you can customize your keyboard with an array of cherry switches to your own preference.


The big white one and the current black one are both rubber dome. I used the big grey one about 10 years ago; I want to say it felt different, but I'm pretty sure it's rubber dome too.


The current one "Starboard" feels like rubber domes.


I expected that "timeline" to end in the late 90s but huh, that is quite impressive that they still have a market for those things.


They make a lot of sense for terminal users. And the newest version is actually a very decent keyboard, so there's no upside to using another one. Many departments that have some terminals will just use them at every computer, so that employees with a terminal license don't have to change hardware when they swap places (since Bloomberg Anywhere, licenses are per person, not bound to the computer).


All yours for 24,000 USD a year.


The fingerprint scanner was always a curious addition to me. It seemed completely useless, other than being a great way to justify spending $25k a year on a terminal.

I'm also very curious as to whether it actually works. Is the biometric authentication properly configured to actually be more secure?


The fingerprint scanner is intended to benefit the company, not the users. It makes it significantly more difficult to share an account, which protects Bloomberg's $24,000 / user / year revenue.


One doesn't pay per user, but only for the terminal. I.e. you could have thousands of users per terminal. Per user licenses are called "Bloomberg Anywhere" which is quite similar to Refinitiv's EIKON.


Bloomberg has two different forms of licenses. Traditionally, licenses were bound to a computer. With mobile phones becoming more popular, they now tend to be per person (you can still get the terminal-bound ones though). To ensure that people don't share accounts, you need to identify yourself with your fingerprint. You can do that either using the keyboard or with a portable B-Unit to generate a token. Without either of these, support will generate a code for you, but only give you 24h or so until you need to verify with a valid B-Unit.

At $24k a year plus exchange fees, companies are very eager to minimize the number of licenses so Bloomberg is quite strict in enforcing these limits.


Bloomberg Terminals connect to market liquidity. The fingerprint ensures that the user is the person executing any and all orders.


Is anyone using a 2004 era CTB100 at home?

I love the extra buttons and that it comes with a speaker for voice chat.

I'm considering using one for programming, with the extra buttons remapped.


I saw one on eBay and I'm seriously considering it. I've found nothing online about using them in Linux, but I'm willing to give it a shot. I'd wager it shows up as a big pile of USB endpoints connected to a hub or somesuch, might be interesting to see what you can get working.


Quite a few of the keys have awkward action with excess stiction. Not great for programming (and indeed many Bloomberg programmers used other keyboards when that one was current).


Excess stiction - so you mean the keys require a lot of force to be activated?

Could it be the plastic getting older and less soft?


The latest iteration is rather underwhelming. I hope they're going back to some bigger designs again!



