High-Speed Trading Firm Deleted Some Code by Accident (bloombergview.com)
75 points by fahimulhaq on Oct 2, 2015 | 60 comments



In finance you'll find low accountability for developers. Basically, finance is where great developers go to get paid a lot to become horrible developers. Very few companies in finance are raising the bar on code quality, because culturally they cannot accept the cost of doing so. The investment doesn't seem worth it until it is too late, and by then the code base is too entrenched.

I've seen this at countless finance companies, and from interviewing people across the entire spectrum of finance. It is a rare corner of the industry where you'll find quality engineering. Usually everyone in a finance company thinks they are amazing engineers, and they are just plain wrong.

My advice: work smarter, not harder.


At the proprietary trading group where I work, the three core developers own the company. And we care greatly about the code quality, as it is our money on the line.


I can confirm this. I just wish I had known about it before I accepted my current contract...


Get out while you can.


But is it enough to put a company that doesn't do that at a competitive disadvantage? Is this really just company culture and assumptions, or are there actual market forces behind it?


It is a mixed bag.

In finance, technology is seen as a cost center, not a profit center. They want the minimum needed to ship. Early on this results in a lot of tech debt. Later on, fixing the system is too hard, so for any code change they need to hire super smart people just to figure out the mess they are in.

Enforcement actions are INCREDIBLY rare. The penalties are shockingly low. Thus the cost of failing to comply is low, and businesses do not prioritize compliance.


At a company like Tower, technology is their edge and definitely a competitive advantage. Having a system that is fast, reliable and scales is key to running a profitable automated trading operation. I worked for a similar trading company and technology was definitely not a cost center.

If enforcement actions are rare, it's probably because these firms are playing clean for the most part, contrary to popular belief. If you trade 5-10% of the market, regulators are going to audit you every year. They look for even the smallest issues to fine you for, even minor things like not having the right Head Trader's name on your supervisory forms after someone quits.


The best part of this article is the feature graphic, that of a castle with the caption:

THIS IS A TOWER. PHOTOGRAPHER: DAMIEN MEYER/AFP/GETTY IMAGES

The article mentions "Tower Research".

Is the image/caption a joke, auto-generated from the content, or just plain laziness?


It's a joke, consistent with Matt Levine's writing style. See for example the image and caption for his article on Panda Express: http://www.bloombergview.com/articles/2015-09-24/panda-expre...


Also, 'La Tour' means 'The Tower'.


Every time I read stories about high speed trading, I have to wonder what is actually the point of high speed trading to the society. The point of a free and fast stock market is to provide monetary liquidity and stabilize the economy by shortening the feedback loop. But something tells me that our society and economy derives zero benefit from nanosecond resolution trading vs. trading with say 1 minute resolution. Meanwhile, the market is exposed to wild swings which might be smoothed over or even prevented if markets worked a little slower.


Faster trading allows market makers to operate with smaller capital investments. This lowers the barrier to entry in the market making business, creating more competition. More competition reduces the costs of trading. (And not just in theory. Market making used to be exclusively the domain of a few big players. It's not anymore.) Reduced cost of trading matters to almost everyone who's involved in the markets. Passive index funds, for example, have to trade to reinvest profits and to rebalance their composition. That's why Vanguard -- the most evangelical of the passive fund managers -- firmly defends HFT.

You may ask, well, why do we need market makers at all? Well, you don't have to use them. You have other options for trading. Call up your broker and ask. What you'll discover is that these other options have far larger transaction costs.

A lot of the complaints about HFT I've seen on Hacker News seem to think that no one should get paid for market making. I don't think that's possible. (And if it is, I'd love to know how.) Transaction costs are an inescapable part of trading; ultimately, you're faced with the choice between tilting your hand to other traders or paying some intermediary to take on risk for you. Generally, it costs less to choose the second option.


How does HFT help market making? Would a market that operates with a 1 second resolution be inherently unable to provide the same levels of liquidity that we enjoy today?

Your argument is that Vanguard is a market maker, Vanguard likes HFT, thus HFT must be good. That doesn't really follow at all. Vanguard could be arguing in favor of HFT for all kinds of reasons. One of them could be that they have the money to rent server space right next to the main exchanges and to pay for the ludicrously low-latency connections to them, so they benefit from HFT by being quicker than others and don't want to lose their edge.


No. Vanguard is a large participant in the markets because it manages funds for long-term investors. Markets have bid and offer prices where you can instantaneously sell or buy shares, and these bids and offers are often provided by professional market-makers. Before automation, these were humans on an exchange floor. They could only trade a single stock at a time, hence the name "specialist", since they specialized in that issue. Volumes were also lower. To make enough to earn a living, that human on the exchange floor had to charge a big spread between buy and sell prices. For a long-term investor, this spread is an additional friction cost that drags on their returns when rebalancing portfolios.

Now a small team of researchers can make markets on many securities at once. They don't have to make as much money on each, so the spreads get smaller. Additionally, instead of a monopolistic specialist, these professional traders compete to win transactions by making better prices (higher bids and lower offers) than others. Many market makers try to keep a neutral "book" of positions, so if they sell some GS they may bid higher in other bank stocks to reduce their exposure to market risk. The faster a market maker can update his quotes or hedge when conditions change, the less spread he needs to charge.

http://meanderful.blogspot.com.au/2013/01/hfts-dirty-little-...
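
To make the parent's last point concrete (faster hedging means a tighter quotable spread), here is a stylized back-of-the-envelope sketch. The numbers and the model are my own simplification, not anything from the article or the linked post: assume the adverse move a market maker eats scales like volatility times the square root of the time spent unhedged.

    import math

    def required_half_spread(annual_vol, price, hedge_delay_seconds, fee=0.001):
        """Stylized model: the half-spread a market maker needs to cover fees plus
        the adverse price move expected while still unhedged. Assumes moves scale
        like vol * sqrt(time) (a diffusion approximation); real quoting models are
        far more involved, so treat this purely as an illustration."""
        trading_seconds_per_year = 252 * 6.5 * 3600
        expected_move = price * annual_vol * math.sqrt(hedge_delay_seconds / trading_seconds_per_year)
        return fee + expected_move

    # A $100 stock with 20% annualized volatility:
    print(required_half_spread(0.20, 100.0, 0.100))    # ~0.36 cents if it takes 100 ms to hedge
    print(required_half_spread(0.20, 100.0, 0.0001))   # ~0.11 cents if it takes 100 microseconds

Under those made-up numbers, cutting the hedging delay by three orders of magnitude lets the quoter charge a visibly tighter spread, which is the mechanism described above.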


Ummm... No. That's not remotely what I said. Vanguard is not a market maker, and definitely not a HFT shop. Vanguard is the world's largest mutual fund company. If you're not clear on who the major market players are, you should probably stop worrying about the arcane details of which middlemen these players use to interact.


Vanguard is like the opposite of a market maker; they're a mutual fund manager. They're the people HFT is ostensibly supposed to be screwing.

See, again, per the article, Chesterton's Fence.


> But something tells me that our society and economy derives zero benefit from nanosecond resolution trading vs. trading with say 1 minute resolution.

Not only does nanosecond resolution provide zero benefit, models indicate that it actually harms liquidity. To be specific, serial order processing in continuous-time is more efficient in "time-space", but less efficient in "volume-space". The better mechanism is batch order processing in discrete-time (i.e. process all arriving orders simultaneously in batch every 100 milliseconds, rather than one-by-one every nanosecond): http://faculty.chicagobooth.edu/eric.budish/research/HFT-Fre...
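
For the intuition without reading the paper, here is a minimal sketch of the batch idea. The data structures and the midpoint pricing are my simplifications, not the paper's exact mechanism; the point is only that every order arriving within one interval is matched together, so shaving nanoseconds inside the interval buys you nothing.

    def batch_auction(buys, sells):
        """Match one batch of orders at a uniform clearing price.
        buys/sells are lists of (price, quantity); every order that arrived during
        the interval is treated identically regardless of arrival order.
        A deliberately simplified sketch, not production matching logic."""
        buys = sorted(buys, key=lambda o: -o[0])      # best (highest) bids first
        sells = sorted(sells, key=lambda o: o[0])     # best (lowest) offers first
        filled, bi, si = 0, 0, 0
        clearing_price = None
        while bi < len(buys) and si < len(sells) and buys[bi][0] >= sells[si][0]:
            qty = min(buys[bi][1], sells[si][1])
            clearing_price = (buys[bi][0] + sells[si][0]) / 2   # midpoint of the marginal pair
            filled += qty
            buys[bi] = (buys[bi][0], buys[bi][1] - qty)
            sells[si] = (sells[si][0], sells[si][1] - qty)
            if buys[bi][1] == 0:
                bi += 1
            if sells[si][1] == 0:
                si += 1
        return clearing_price, filled

    # Everything that arrives within one 100 ms interval is cleared together:
    print(batch_auction(buys=[(10.02, 300), (10.01, 200)], sells=[(10.00, 250), (10.01, 400)]))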


What about derived liquidity? A lot of liquidity in ETFs, futures, foreign exchange, dual-listed stocks and interest rate products works this way. Traders will do things like buy futures and sell correlated ETFs to hedge, buy ETFs and sell the basket of stocks it holds, and so on, as a way of making markets.

Because the markets are fast, there is less risk that their hedging product will "run away" from them before they can execute. Since the risk is low, they can make a very competitive and tight market. If they had a 100ms delay to hedge, they would need to make a bigger spread to compensate for the risk.


I don't think high frequency trading increases the probability of dangerous swings. I don't know if they add any value, but I doubt they cause all the problems people blame on them. When they do start doing something stupid, then they will lose money and be punished and other traders will profit from them.


The premise is that they might add value in some circumstances, but in the meantime they extract value 24/7.

>When they do start doing something stupid, then they will lose money and be punished

Hehe, no: trades will be reversed, and the people who "maliciously took advantage of a poor algo's error" will be punished instead.


I think wild market swings have been happening as long as markets have been around. I don't necessarily see the connection between HFT and those swings. Here's a thought: if you assume that swings are going to happen regardless of HFT, then is it possible that HFT actually helps find the real "fair value" level in the market faster than if those players did not exist?

Another thought on HFT's value to society: I generally agree that HFT does not provide "direct" value. There are clearly "arbitrage" opportunities that HFT firms consistently profit from. However, by definition, those arb opportunities will either be taken by someone else, or enough people will rush in to grab them that the arb goes to zero. If there are consistent arb opportunities then something needs to change, namely the regulation which structures the market and allows people to take consistent advantage of it. Thus new regulation is issued and the market, inherently a complex and intricate system, becomes stronger as a result. This is maybe where the value to society lies. And maybe this is an idealistic view of how regulation, society, and HFT interact in this case...?


You can only eliminate cross-market arbitrage by forcing everyone to trade in one place. Exchange competition lowers costs for everyone, so it's not clear that this would improve things. Even on a centralized exchange, near-arbitrage would still exist between very correlated products within it or between similar but not identical products in other places (index futures vs. ETFs, different grades of oil, US treasuries vs. European bonds).

Like you said, if enough people do the arbitrage, it converges to zero very quickly. There's little profit in it and everyone reaps the benefits of near-instantaneous accurate pricing.


No matter how much money is involved or how good the team is, someone somewhere will always do an oopsie. Maybe that particular code should have had a little comment such as // used by ISO stuff, do not delete

Commenting tricky, non-obvious code is highly recommended.


Some unit tests would be nice. Maybe even a special folder full of tests that know when laws have been broken.
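
Something like the following hypothetical sketch is roughly what that folder could contain. Every name and data structure here is invented for illustration; the only point is that the test is named after the regulation, so deleting the ISO code produces a failure that says what law just got broken.

    from dataclasses import dataclass

    @dataclass
    class ChildOrder:
        venue: str
        price: float
        is_intermarket_sweep: bool

    def violates_order_protection(visible_offers, child_orders):
        """True if we executed at some price while a better-priced venue was displayed
        and we did not send that venue an intermarket sweep order (ISO) at the same time."""
        worst_price_paid = max(o.price for o in child_orders)
        swept_venues = {o.venue for o in child_orders if o.is_intermarket_sweep}
        return any(
            price < worst_price_paid and venue not in swept_venues
            for venue, price in visible_offers.items()
        )

    def test_rule_611_iso_sweep_is_complete():
        """Named after Reg NMS Rule 611 (the order protection rule), so a red test
        points straight at the regulation the latest change violated."""
        offers = {"NYSE": 10.00, "BATS": 10.01, "NASDAQ": 10.02}
        sent = [
            ChildOrder("NYSE", 10.00, True),
            ChildOrder("BATS", 10.01, True),
            ChildOrder("NASDAQ", 10.02, True),
        ]
        assert not violates_order_protection(offers, sent)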


Probably exists now, and not before.

Writing this software would be very scary for me -- I'm confident in my work for the most part, but I'm only human. Then you let a for-loop go wild making thousands of trades per second across the entire market, with real money, against other high-frequency traders no less... man... you gotta be really, really, really sure of your code!


It is not about confidence. It is about ownership and being a good engineer. A good engineer would have cared about the code base and ensured that a test would fail, that the failure would be linked to a use case, and that someone would see they had now violated a regulation.

Then if the regulation changed, someone would grep through the use cases and see what behavior is occurring and how to modify the code base to cope with the change in regulations (this happens a lot). Of course most finance firms don't do this and actually violate regulations all the time. There is literally a gold mine for the SEC/FINRA to go after, but the government is fairly incompetent at mining through the data and holding companies accountable. The reason is that they don't pay enough to hire the talent that would uncover the violations.


Real engineers are like true Scotsmen, because you can never have enough process.

In this case, of course tests really do prove the absence of bugs rather than just (not) proving their presence.

I really don't like the whole "unit tests let you not worry about breaking things" view. That's not confidence, it's overconfidence. And it mostly doesn't work for things where "correct" is fuzzier than "eventually produces this exact output".


It is confidence. It is the right tool for the job.

What would you propose as an alternative?


I'd be surprised if they didn't make postmortem changes to prevent such incidents in the future. Your posts in this thread imply that no mistake should ever happen, but let's be real - mistakes will always happen, the key is to do proper postmortem analysis and learn from them (and obviously prevent recurrences).


Some mistakes, yes. But this was a case of such a basic and clear cut use case that should have been tested.

My understanding is that companies such as this rely purely on manual testing. Thus it is easy for things to slip through the cracks. Usually they never get caught or hit with penalties, so the pressure to do better is not there.


Yeah, well, fkups happen, you just gotta make sure you are making more money than you lose in those incidents.


Or you just write a test that checks to make sure you don't screw up.


If you're designing a new airliner then yeah, you can't screw up, and you end up taking years to deploy.

In the HFT world, you may have only days, or sometimes hours, to deploy. Yes you still write tests, and a bazillion safety checks when you deploy, but if you manage your risk down to zero then this is definitely not the most profitable way to go.


Or how about at least some code documentation? In this case the programmer deleted the code because he didn't understand why it was there. A simple "# Needed to comply with regulation X" would probably have prevented this problem.


There's a specific incantation required to send routed ISO orders via BATS, as mentioned in the article. I can see how a refactoring might accidentally change that.

With that said, libraries that send orders to exchanges are required by the exchanges to be certified against their expected behaviors. If you want to send routed ISO orders, you have to test that with them. When making changes to said libraries, it can often be good to re-certify to ensure things are still working as expected.
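
As a hypothetical example of the cheap regression check that could sit in front of re-certification (the encoder, tags, and values below are FIX-style placeholders of my own; the venue's cert suite defines the real requirements):

    def encode_order(symbol, qty, price, intermarket_sweep=False):
        """Toy encoder producing FIX-style tag=value pairs (tags are placeholders)."""
        fields = {"55": symbol, "38": str(qty), "44": f"{price:.2f}", "40": "2"}  # limit order
        if intermarket_sweep:
            fields["18"] = "f"   # ExecInst-style flag marking an intermarket sweep
        return "|".join(f"{tag}={val}" for tag, val in sorted(fields.items()))

    def test_routed_iso_flag_survives_refactor():
        """Run after any change to the order-encoding library, before re-certifying:
        a routed ISO must still carry the sweep flag or downstream routing silently breaks."""
        message = encode_order("XYZ", 100, 10.00, intermarket_sweep=True)
        assert "18=f" in message, "ISO flag was dropped from the encoded order"

The idea is simply that a refactor of the encoding library cannot silently drop the sweep flag without a test going red before you go back to the exchange to re-certify.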


Acceptance testing fail! When changing code, and especially code that has consequences, doing so without tests is a bad idea. Doing so to protocol code without conformance tests is doubly bad.


Sadly, this is the norm in finance. You would be shocked at the shittastic state of the companies that dominate in this industry. They are literally a house of cards waiting for a strong gust of wind.


...I'm missing something here. How do they not know how much they've asked to trade and not heard back on yet? Wouldn't they have to track that, to not try to trade the same exact shares the same exact way a bunch of times? Is their local system maybe just more synchronous than I'm assuming would be possible?


Can someone explain why the SEC rules insist you must buy from the exchange with the lowest price first? It appears to me to be a rule just to keep the small exchanges alive.


To stop brokers screwing their clients by routing orders to their mates.


Otherwise a broker would just route mom n' pop's orders to the exchange that pays the biggest bribe to the broker (and screw mom n' pop).


Yes, but for proprietary traders using their own capital, it doesn't make a lot of sense to force them to trade in a particular way, since executing sub-optimally can only hurt their own performance. There are legitimate reasons not to trade with the lowest-priced market (maybe you believe the quote is slow, want to trade bigger size all at once, or don't want to risk signaling to the market): http://www.hudson-trading.com/static/files/RegNMShudsonriver...

And retail brokers already do what you suggest. All their marketable flow gets sold to off-exchange market makers, while their limit order flow gets routed to the exchanges that pay the highest liquidity rebates; Reg NMS just mandates what price it can trade at: http://news.indiana.edu/releases/iu/2014/02/study-of-potenti...


On a related note, Knight Capital lost $440 million due to a software bug a few years ago.

http://www.bloomberg.com/news/articles/2012-08-02/knight-has...


It actually blew up the whole firm, and the failure was more closely related to change-management policies and systems monitoring.

The SEC order there is an interesting read for anyone involved in large scale transactional systems: https://www.sec.gov/litigation/admin/2013/34-70694.pdf


With hindsight, it's always easy to ask for more tests or code comments. I guess the rule of thumb is that if the code is there, it's most probably there for a reason.

When something like this happens, it's always a series of oversights.

1. Someone forgot to add comments or test cases while writing the code (or perhaps he wrote them but people are not running the test cases before commits).

2. Someone else thought the code was dead and deleted it (and skipped the test cases, if there were any covering that particular scenario).

3. Either the person deleting the code didn't wait for the code review, or the code reviewers missed it as well. (If code reviews are non-existent, it's a disaster waiting to happen.)

So in hindsight: write comments, write test cases, run the test cases, and do code reviews carefully.


Move fast. Break things. Pay millions of dollars in SEC fines.


A change like this making it through to production is indicative of a test failure, either automated or human. I have had the luxury of substantial human test teams on trading systems before 2008, and it was great. They wouldn't let something like this through...


Who doesn't notice a huge deletion in one changeset? This seems like a convenient excuse. The SEC report:

http://www.sec.gov/litigation/admin/2015/34-76029.pdf


I don't know much about stock trading, can someone ELI5? Thanks.


Basically imagine you want to buy apples from 3 different fruit stands in different parts of a city.

Fruit stand 1 is the cheapest, fruit stand 2 is the second cheapest, and fruit stand 3 is the third cheapest.

You want to buy all the apples out there.

The rule is that you have to go to the cheapest fruit stand first, and exhaust their supply before moving onto the next cheapest fruit stand. If you want to go to ALL the fruit stands at the same time you need to tell them you are an 'OK' guy and are really buying fruit everywhere.

Well, someone deleted code and ignored one of the fruit stands. This violated a rule and cost the fruit buyer a lot of money in penalties.
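
If it helps, here is the same analogy as a toy snippet (entirely made up, just restating the rule above): you must exhaust the cheaper stands before paying more at the next one, and the deleted code effectively made the firm skip one of the stands.

    def buy_apples(stands, want):
        """stands: {name: (price_per_apple, supply)}. The rule from the analogy:
        exhaust the cheapest stand before paying more at the next one. (The ISO
        analogue is flagging yourself as the 'OK guy' and hitting every stand at
        once instead of walking them in sequence.)"""
        purchases = []
        for name, (price, supply) in sorted(stands.items(), key=lambda kv: kv[1][0]):
            take = min(want, supply)
            if take:
                purchases.append((name, take, price))
                want -= take
            if want == 0:
                break
        return purchases

    # Never skip a cheaper stand that still has apples:
    print(buy_apples({"stand1": (1.00, 5), "stand2": (1.10, 5), "stand3": (1.25, 20)}, want=12))
    # -> [('stand1', 5, 1.0), ('stand2', 5, 1.1), ('stand3', 2, 1.25)]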


Why does the SEC care that you bought shares at a price higher than the cheapest? How does that hurt anyone other than yourself?


If I ask my broker to buy some stock, I want it at the lowest price. My broker, left to his own devices, may prefer to buy at a higher price because it is more profitable for him.

This rule seems designed to protect the consumer when the consumer's and the stock broker's goals are not aligned.


The article says they only do proprietary trading, which I thought means they only use their own money. I guess that's rare enough that the rules apply regardless in order to simplify things?


The rules apply to the whole system -- notably exchanges, the exchange "members", and broker-dealers. Latour is a broker-dealer (BD) and thus has to play by the rules, even if they don't have customers.

If they weren't a BD, their BD (which they would need in order to reach the markets) would be on the hook for letting this happen. ISOs are something that gets a lot of compliance/regulatory scrutiny, and most BDs don't let their customers use them. If they are allowed, there's a lot of reporting/transparency/surveillance to ensure they are issued properly. And as you can see from this fine, there are a lot of rules and corner cases one must cover, and apparently even the best firms mess it up one way or another.

Latour's previous fine (read Matt Levine's continually excellent articles about this stuff) was for "Net Capital Requirements", which are designed to keep customer money safe, particularly in the presence of other customers who might blow up... of course Latour has no customers, but those are still the rules. [Although IMO this one was a grey area where Latour probably had better models than what the rules required, but since the models didn't match the rules exactly, they got fined.]


The person offering the cheaper price is hurt, because someone offering a worse price is getting filled before him. Essentially, the moral argument of Reg NMS is to enforce price-time priority in a distributed system. Whether it actually accomplishes that is...well, I'll leave that up to you to decide.


It enforces price priority, not price-time priority. Exchanges can use pro-rata, etc. at the same price level; and if multiple exchanges offer the same price level, a BD is free to choose any of those to route to.


These rules apply when you are trading with money that is not yours: as a fund manager, hedge fund trader, broker, etc.

Intermediaries make up 99.9% of the market because very few individuals would bother paying to connect to a dozen stock exchanges if they can pay a broker a few bucks to do their trade instead.


And relatedly, what is the point in the existence of multiple markets, if not that you can choose which market to trade on for your own technical–logistic reasons?


You can, but only at the same price. It's a weird system, especially since markets are located in different places, so from each trader's or exchange's vantage point things are in different states. Working around this friction led to an explosion of special order types that some people argue are unfair or overly complex.

In Europe and Asia there are competing exchanges, but no rule forcing people to route between them. There are some guidelines giving brokers handling customer orders a duty of "best execution", but proprietary traders and professionals can do whatever they feel is ideal.

And guess what: there are very few "trade-throughs", or cases where markets invert with each other, despite the lack of such a rule. People usually act in their economic best interest without a law mandating that they do so. If it happens, it's an arbitrage that someone will eliminate very quickly.


Another reason code needs to be idiomatic.



