1. HFT: In Search of the Truth

    Statistics is a funny field. You can generally construct well-reasoned, well-supported arguments on both sides of an issue. I have found a paucity of independent, objective writing on the subject of market quality, and recently found myself researching what studies have been performed. Reviewing the literature on market quality, it is clear that nearly all studies of equity market quality have been done by either:

    1. HFT firms
    2. Exchanges (whose best customers are HFT firms)
    3. Academics sponsored by HFT firms or Exchanges

    Part of the problem is the data itself: it’s simply immense. It helped coin the term “Big Data.” And you know it’s bad when a bunch of computer scientists call something Big.

    This is not just a question of acquisition cost, though it is certainly costly to obtain this data - irrationally so, considering this is data about public markets. It’s also a question of technological infrastructure and expertise. It takes a lot of storage and computing resources to store and analyze this data, and those who know how are generally making large amounts of money in the industry, or looking to get a job making large amounts of money in the industry.

    I’d like to start this article by reiterating my call to the SEC to open up the market data. Open up access to MIDAS, the SEC’s quantitative analysis platform, to academics and independent researchers! Embrace the principles of the open source movement, and make it cheap and easy to perform studies on market data with the goal of advancing the public discussion and regulatory decisions. The SEC team at the head of the MIDAS project is talented, but small and resource-constrained. I have called for this in my Senate testimony and my public comments on the SEC Technology Roundtable. Open up the data! There’s really no good argument against it.

    That being said, it is fruitful to examine evidence and studies that are not produced by the three groups mentioned above. Is it ironic to decry the manipulation of statistics in one sentence, and then to use them to prove your case? Probably. Disingenuous? I don’t think so. After all, each of the parties mentioned above has a distinct book to talk, a P&L they’re responsible for, whereas I’m seeking independent analysis and trying to improve markets. The case could be made that I’m as guilty as any of them. Clearly, I won’t let that stop me from trying.

    Now, it has become common practice in this industry to accept that HFT has improved market quality by tightening spreads, increasing the number of shares (depth) in the book, and reducing volatility. But do independent studies bear this out?

    I would like to examine this question in detail, but first I’d like readers to consider something broader. The market has clearly transformed over the past 2 decades. Many of these changes were extremely beneficial, including decimalization, improved order routing algorithms and increased competition. Some of these changes are more subtle, and far less studied.

    In an interesting paper published in 2010, Reginald Smith asked the question “Is high-frequency trading inducing changes in market microstructure and dynamics?” He went on to demonstrate that while historically, the Hurst exponent of the stock market has been measured at 0.5, consistent with Brownian motion and our general understanding of how the market functions on small timescales, that value has been increasing since 2005 and Reg NMS. While this is certainly an esoteric statistical discussion, his conclusion is striking:

    “we can clearly demonstrate that HFT is having an increasingly large impact on the microstructure of equity trading dynamics. … this increase becomes most marked, especially in the NYSE stocks, following the implementation of Reg NMS by the SEC which led to the boom in HFT. … volatility in trading patterns is no longer due to just adverse events but is becoming an increasingly intrinsic part of trading activity. Like internet traffic Leland et. al. (1994), if HFT trades are self-similar with H > 0.5, more participants in the market generate more volatility, not more predictable behavior.”

    A market that is becoming more self-similar is a market more prone to positive feedback loops, illiquidity contagions, and generally non-linear behavior. It’s a market that is less resilient and more prone to instability. It’s a market that none of us want. It’s a market that takes our entire body of academic theories on how capital markets operate, and overturns the most basic underlying assumption. I don’t mean to get bogged down in this discussion, but I’d like readers to keep it in mind as we discuss market quality.
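    For readers who want to experiment with this themselves, the rescaled-range (R/S) method that underlies Hurst exponent estimates can be sketched in a few lines. This is a simplified illustration of the mechanics only (it omits the small-sample corrections a serious study would use, and all names are mine, not Smith’s):

```python
import math
import random
import statistics

def hurst_rs(returns, window_sizes=(8, 16, 32, 64, 128)):
    """Estimate the Hurst exponent via rescaled-range (R/S) analysis."""
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_values = []
        # Split the series into non-overlapping windows of length n.
        for start in range(0, len(returns) - n + 1, n):
            window = returns[start:start + n]
            mean = sum(window) / n
            cum, deviations = 0.0, []
            for x in window:
                cum += x - mean
                deviations.append(cum)
            r = max(deviations) - min(deviations)  # range of cumulative deviations
            s = statistics.pstdev(window)          # standard deviation of the window
            if s > 0:
                rs_values.append(r / s)
        if rs_values:
            log_n.append(math.log(n))
            log_rs.append(math.log(sum(rs_values) / len(rs_values)))
    # The least-squares slope of log(R/S) against log(n) estimates H.
    pts = len(log_n)
    mx, my = sum(log_n) / pts, sum(log_rs) / pts
    slope = sum((x - mx) * (y - my) for x, y in zip(log_n, log_rs)) / \
            sum((x - mx) ** 2 for x in log_n)
    return slope

random.seed(0)
white_noise = [random.gauss(0, 1) for _ in range(4096)]
h = hurst_rs(white_noise)  # uncorrelated returns should land near H = 0.5
```

    An H near 0.5 is the Brownian, “memoryless” baseline; persistent runs of H above 0.5 on real trade data are what Smith flags as self-similarity.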

    The spread between the prices that market participants are willing to buy and sell stock at is often the most cited statistic out there. HFT proponents scream about penny spreads and how dramatically they’ve improved spreads in the market. It is questionable whether spreads have actually tightened, both in absolute terms and when adjusted for volume. For example, Watson, Van Ness and Van Ness (2012) using Dash-5 data find that the average bid-ask spread from 2001-2005 was $0.022 (2.2 cents), while the average bid-ask spread from 2006-2010 was $0.027 (2.7 cents), a dramatic increase of 22.7%. This would suggest strongly that spread improvements are more a result of decimalization and pre-NMS computerization than HFT efficiencies.

    Kim and Murphy (2011) did an examination of model-implied effective spreads. They looked at four common models used to calculate effective spreads and found that all four models underestimate spreads by 41% - 46% for 2007-2009. Using their model, which takes into account volume and therefore provides a true apples-to-apples comparison of spreads across different time periods, they find that spreads between 1997 – 2009 are actually quite similar. Trade sizes are much smaller in the era of electronic markets and HFT, as institutional orders must be broken up dramatically to try to reduce market impact. You must therefore collapse consecutive buy/sell orders into a single transaction to evaluate the effective spread of a trade relative to spreads from the days when trades were larger.

    Their analysis focused on SPY, the most liquid instrument in the market. According to them, “the average size of a buy or sell order in 1997 is 5,600 shares, while in 2009, it is only 400 shares.” Therefore any study that claims to compare spreads between these periods without adjusting for volume, or simply looking at instantaneous bid-ask prices, is flawed. It also means that nearly any study that claims spreads have tightened in a post-NMS world, but neglects to account for trade size changes (as every study that I’ve seen does) is fundamentally flawed.
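    The collapsing step can be illustrated with a toy example. This is not Kim and Murphy’s actual model, just a sketch of the idea that consecutive child fills on the same side should be merged into one volume-weighted parent trade before measuring the effective spread:

```python
def collapse_trades(trades):
    """Merge consecutive same-side executions into one volume-weighted trade.

    trades: list of (side, price, shares) tuples in time order, side 'B' or 'S'.
    Toy illustration: many small child fills are evaluated as one parent trade.
    """
    collapsed = []
    for side, price, shares in trades:
        if collapsed and collapsed[-1][0] == side:
            _, prev_price, prev_shares = collapsed[-1]
            total = prev_shares + shares
            vwap = (prev_price * prev_shares + price * shares) / total
            collapsed[-1] = (side, vwap, total)
        else:
            collapsed.append((side, price, shares))
    return collapsed

def effective_spread(trade_price, midpoint):
    """Conventional effective spread: twice the distance from the midpoint."""
    return 2 * abs(trade_price - midpoint)

# Five 400-share child buys walking up the book...
fills = [('B', 100.01, 400), ('B', 100.02, 400), ('B', 100.03, 400),
         ('B', 100.04, 400), ('B', 100.05, 400)]
parent = collapse_trades(fills)[0]  # one 2,000-share parent at a 100.03 VWAP
# Against a 100.00 midpoint, the parent's effective spread (6 cents) is
# triple what the first child fill alone (2 cents) would suggest.
```

    Comparing the first child fill’s 2-cent spread against a 1997-era block trade is exactly the apples-to-oranges comparison the paper warns about.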

    Another excellent study led by RBC’s Stephen Bain demonstrates that in Canada, spreads have indeed tightened post-2009 (these are nominal spreads, not effective spreads), but that this has come at a cost in terms of volatility. They found that “short-term price gyrations for individual stocks … have effectively doubled from pre-2000 levels to present.” They conclude that “the behavior incented by today’s market has increased effective spread costs for investors by eroding the quality and reliability of the liquidity provided.”

    Chart from RBC Capital Markets: Evolution of Canadian Equity Markets, February 2013.

    Which brings us to our next measure of market quality, volatility.

    Volatility

    Volatility is one of those things that is difficult to study. It’s certainly not difficult to measure, but to be able to attribute it to individual causes is a challenging exercise. We know that HFT thrives at heightened levels of volatility (though not too high!), and therefore extracts higher rents. We should examine the claim that volatility has dropped as a result of market structure improvements and HFT.

    A cursory look at the VIX calls this story into question. From 1996 – 2003, the VIX oscillated but maintained a baseline in the 20’s. From 2003 – 2007 it dropped steadily. We’ve obviously had some interesting and non-standard market conditions since then, but from a steady-state perspective the VIX since early 2009 looks remarkably similar to the VIX during the post dot-com crash period of 2003 – 2007. I believe the VIX to be a less interesting and less accurate gauge of volatility, and much prefer examining intraday volatility to better understand what’s happening on a daily basis in the market.

    Once again referring to the study by RBC, they demonstrate conclusively that volatility has increased in Canadian markets, and has been doing so since 1996. In a very compelling illustration of this they present “a histogram of the distribution for four date ranges covering eighths trading, nickel trading, decimalization and finally market fragmentation and proliferation of maker-taker. For the identified time periods we can see that the distributions for more recent periods exhibit a higher mean and more positive skew.”

    Chart from RBC Capital Markets: Evolution of Canadian Equity Markets, February 2013.

    This is a histogram of the “distribution of trading prices relative to short-term equilibrium”, and demonstrates that each of these spread-narrowing innovations has been associated with an increase in relative volatility.

    Another problem in volatility analysis is related to spread analysis. RBC found in that same study that 60% of their order flow is traded by 11am, and volatility during this first hour and a half of trading was far higher than in the past. Failing to adjust volatility calculations for volume or time, and averaging over the entire day, is a flawed methodology, but one that most pro-HFT studies use.

    Nanex has done some excellent work in volatility analysis and their Relative Intraday Volatility (RIV) measure gives us a good indication of the trends in the US Markets. They show a distinct trend of increasing intraday volatility since Reg NMS was passed:


    Chart from Nanex: http://www.nanex.net/aqck2/3563.html

    This chart shows that since 2007 RIV in SPY has more than doubled, with peak intraday volatility 10 times higher in August 2011 than in 2006.

    I would argue that the increase in the Hurst exponent mentioned above is one major cause of increased volatility. Another excellent paper may hold another key to this increase, as well as the increase observed in the US. Dichev, Huang and Zhou (2011) tie increased trading volume to increasing volatility. They looked at data since 1926 and cross-sectionally over the last 20 years and found that “in recent years trading-induced volatility accounts for about a quarter of total observed stock volatility.” Their conclusion is that as volume increases, so does volatility. And if there’s one thing everyone can agree on, it’s that HFT has increased volume. Lest anyone make a mistake, even the SEC has admitted that increasing volume is not the same thing as increasing liquidity, and the Flash Crash is the perfect illustration of that.

    The Head of Financial Stability at the Bank of England, Andrew Haldane, found in his July 2011 study that “intraday volatility has risen most in those markets open to HFT”. He also noted that “HFT algorithms tend to amplify cross stock correlation in the face of a rise in volatility”, meaning that individual risks are made more systemic, and particularly so during times of market stress.

    While examining intraday volatility is useful, proponents of our current market structure will also point to many studies arguing against the ones that I’ve presented (although most would fall into one of the three categories laid out at the beginning of this article). At the very least the takeaway should be that this is a much more nuanced argument than is commonly accepted. At worst, other studies are published by those incentivized to maintain the status quo in market structure and are flawed.

    Of course, this examination of volatility says nothing about the increase in catastrophic technology failures and so-called “black swan” events. The complexity of today’s market makes it more fragile and more prone to these catastrophic events. It seems to be only a matter of time before we see another one.

    Conclusion

    This article is not meant to vilify High Frequency Trading. It is rather to shine some light on the changes in market quality during the most incredible regulatory and technological transition that most of us have witnessed. The primary force at work here is Regulation NMS in the US, and one of the consequences of this regulation was summed up by Eric Noll of Nasdaq at the Market Structure Roundtable on May 13, 2013. He stated that speed has become the de facto differentiator after price, as an unintended consequence of NMS.

    This is something that I will examine further in subsequent articles, along with recommendations on how to disincentivize speed and end the race to zero latency. This drive to zero has had extreme consequences in terms of market complexity and fragility, with little benefit to the investing public. That is not to say we must wax poetic about the days of the specialist; it is simply to state that there is a far better market structure out there than what we have, and we should seek to structure proper regulatory incentives to gently nudge the industry in that direction. That begins with reducing complexity (reexamining market data fees and revenue distribution along with the trade-through rule is a great start here) and reducing the value of speed.

    Let us not fool ourselves that the issue of market quality is cut-and-dry. It is a nuanced discussion, and one that should be taken more seriously than it currently is.

     
  2. The “missing” Nasdaq Data

    Eric Hunsader from Nanex has posted the following bounty:
    We are offering $10,000 to the first person who can show us that there were no missing trades or quotes from Nasdaq in NMS stocks between 11:29:53 and 11:30:09 Eastern Daylight Time on May 18, 2012. Basically, you have to find the missing data for the gaps in the charts on this page.
     
    There is, however, a problem with this request. He is presupposing that there is missing data. I can easily address the first sentence, and show that there is no missing trade or quote data from Nasdaq for that time period on May 18, 2012. That does NOT mean that I have found “the missing data for the gaps in [those] charts”, as there is no missing data. To understand this, you must first understand how the Nasdaq order entry interface and matching engine operates.
     
    Nasdaq accepts electronic orders via OUCH or FIX ports (OUCH is the name of the binary protocol, FIX is a text-based protocol). Those orders are then sent to a central sequencer (via different paths, depending on which protocol you’re using), where they are assigned a sequence number. Once an order is assigned a sequence number, an acknowledgement message is generated by the sequencer / matching engine, and that acknowledgement is sent back down to the client that entered the order. An order is not considered accepted until it is acknowledged (ack’ed). This is an elegant design that has other features, such as permitting only one un-ack’ed order in-flight between each OUCH/FIX port and the sequencer. That acts to naturally throttle orders and ensure minimal loss in the event of a failure.
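    The one-in-flight throttle described above can be modeled in miniature. This is purely an illustrative sketch of the logical flow, not Nasdaq’s actual implementation; the class and method names are mine:

```python
class Sequencer:
    """Toy sequencer: assigns a monotonically increasing sequence number
    to each accepted order and keeps an append-only log of assignments."""

    def __init__(self):
        self.next_seq = 1
        self.log = []  # (seq, order_id) - a gap here would mean lost data

    def assign(self, order_id):
        seq = self.next_seq
        self.next_seq += 1
        self.log.append((seq, order_id))
        return seq


class OuchPort:
    """Toy order-entry port: refuses a new order until the previous one
    has been acknowledged, naturally throttling each connection."""

    def __init__(self, sequencer):
        self.sequencer = sequencer
        self.in_flight = None  # the one permitted un-ack'ed order

    def enter_order(self, order_id):
        if self.in_flight is not None:
            return 'rejected: previous order not yet acked'
        self.in_flight = order_id
        return self.sequencer.assign(order_id)

    def on_ack(self, order_id):
        # The sequencer's acknowledgement frees the port for the next order.
        if self.in_flight == order_id:
            self.in_flight = None


port = OuchPort(Sequencer())
first = port.enter_order('ORD-1')    # assigned sequence number 1
blocked = port.enter_order('ORD-2')  # rejected: ORD-1 not yet acked
port.on_ack('ORD-1')
second = port.enter_order('ORD-2')   # now assigned sequence number 2
```

    The key property for what follows is the sequencer’s log: every accepted order consumes exactly one sequence number, with no holes.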
     
    Here is a diagram showing the LOGICAL flow (this is not meant to be a physical diagram, just a logical one, showing a very high-level, simplified view of the Nasdaq order entry system - there are obviously MANY other systems than what is pictured here):
    image
    On May 18, while trying to price and print the Facebook IPO, Nasdaq systems entered a feedback loop from which they would not be able to escape. Nasdaq engineers quickly pushed out new code that removed the validation loop, as explained by the SEC in their recent report on the record fine that Nasdaq has to pay. In order to activate this new matching code, Nasdaq had to failover their systems - they had to take the primary sequencer / matching engine down, and have all of their systems and clients reconnect to the new one, running the new code that would avoid the validation loop. 
    At approximately 11:29:53, Nasdaq shut down their primary system and initiated this failover. It took longer than they expected, almost 17 seconds. During this 17-second window, Nasdaq was not accepting or acknowledging new orders. When the backup / failover system was finally active, orders were once again accepted and market data began disseminating again. There was no burst when this happened, as there was no queue of orders built up, simply because Nasdaq was not accepting new orders or filling any existing orders. All activity ceased for that time. If there were crossed quotes in the market, they were stale quotes that were no longer valid.
     
    The evidence for this is simple. There is no sequence number gap in the Nasdaq data. If you understand how the Nasdaq matching and sequencing infrastructure works, and you know there was no sequence number gap, then you can see that there is no missing data. I have spent many years studying Nasdaq’s order entry and matching infrastructure, first as a high-frequency trader and later simply as a technology architect who is very impressed with an incredibly scalable, high-performance trading and matching system. I can say definitively, based on the lack of a sequence number gap, that there are no missing quotes or trades. There are no missing, ack’ed orders. Nasdaq was simply down, and failing over during that time.
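    The sequence-gap check itself is trivial, which is part of why this evidence is so clean. A minimal sketch:

```python
def find_sequence_gaps(seqs):
    """Return (expected, got) pairs wherever consecutive sequence numbers
    jump by more than one - the signature of missing messages."""
    gaps = []
    for prev, cur in zip(seqs, seqs[1:]):
        if cur != prev + 1:
            gaps.append((prev + 1, cur))
    return gaps

# A halt produces a pause in *time*, not a gap in *sequence numbers*:
clean = find_sequence_gaps([1, 2, 3, 4, 5])      # no gaps: []
# Genuinely lost messages would show up as a jump:
dropped = find_sequence_gaps([1, 2, 5, 6])       # gap: [(3, 5)]
```

    Applied to the Nasdaq feed around 11:29:53, the check comes back clean: the numbering simply pauses and resumes, consistent with a failover rather than lost data.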
     
    I accept checks and money orders. :)
     
  3. The Fragility of the US Markets

    "Did you hear? The stock market broke the other day. Stocks went inexplicably haywire, costing investors millions in a matter of seconds. It appears to be the result of a single market participant."

    It should frighten you to know that the quote above would have been a truthful statement roughly half a dozen times during the last 2 years. The United States capital markets have broken down. They no longer serve as an efficient system for allocating capital to companies. Our markets have become a playground for computer and data scientists who are trying to make their fortunes by writing advanced algorithmic trading strategies.

    The changes in the US markets have come over several years, but the root cause is simple and obvious - high-speed, computerized trading systems. While these advanced technologies could be very beneficial if cautiously implemented and properly regulated, they have instead overcome our ability to truly understand and control them.

    The owners of the largest market making firms have:

      • Recklessly implemented strategies that are poorly understood and not well tested.
      • Built entirely new exchanges to circumvent the restrictive regulations that tied the hands of the NYSE and Nasdaq.
      • Devoted massive lobbying resources to influencing regulations so as to benefit their style of trading.

    What they have failed to realize (or choose to ignore) is that orderly markets and responsible regulation are in their long-term best interests. Instead of embracing that philosophy and working to build a stable system with meaningful reforms they focus on short-term profits above all else, and subvert the rulemaking and regulatory process at every opportunity.

    The result is the US stock market that stands before you today, a system so complex and fragile that individual market participants can cause extreme instability and even crash the entire market. If you have a cold, you’re not allowed near the computers running the market - a sneeze could erase a hundred billion dollars of wealth.

    This should, of course, be terrifying to regulators, traders and anyone else whose livelihood depends on the efficiency and stability of the stock market. That any one individual participant currently wields the accidental power to temporarily derail an entire market is evidence that our system is woefully broken. Computer “bugs” and “glitches” make news on a regular basis. Inexplicable moves of great magnitude happen regularly on a single-stock or market-wide basis. One is left to wonder when these sorts of “mistakes” can be manufactured and manipulated for malicious purposes - or whether such foul play has already happened before our eyes.

    The most recent events on Wednesday July 31 are simply the latest manifestation of a stock market that is teetering on the edge of an abyss:

      • The Flash Crash in May, 2010 was set off by a single large trade estimated at $4.1 billion in the S&P 500 Futures Market. The cascade led to 20 minutes of extreme volatility, wiping out nearly $1 trillion of market cap before quickly and inexplicably recovering. The economic cost of this event is completely unmeasured, but certainly huge. We were lucky it didn’t happen near the market close - had the US markets closed before they recovered, the result would have been total economic disaster as money flooded out of the stock market overnight.
      • In August, 2011 the stock market swung up and down by over 4.4% on 4 consecutive days, alternating up and down days. It was wild, unprecedented volatility - only the third time in history that had happened, with the second time having been 3 years prior, during the crash of 2008. While the European crisis was becoming a more important issue at the time, this volatility was not warranted by major economic changes or historic macroeconomic events. This was computer-driven volatility.
      • "Mini flash crashes" occur on a near-daily basis in individual stocks. Nanex has documented over 1,000 instances of individual irregularities in stocks since August 2011. Single-stock circuit breakers have failed to stem the tide and aim to heal a gunshot wound with gauze.
      • IPO’s in Facebook and BATS (itself an Exchange) have gone horribly wrong due to technological “glitches,” souring the market for IPO’s and costing untold thousands of jobs as companies cannot raise the capital they need to expand.
      • Few realize how lucky we were on Tuesday, July 30. An order to sell nearly $4.1 billion in the S&P Futures Market, the same size as what precipitated the Flash Crash, was executed 3 seconds before the market closed. There simply was not enough time for the waterfall of May 6, 2010 to repeat itself. What happens the next time when that same order is sent in a couple of minutes sooner? The SEC typically responds that its circuit breakers will save the day. Unfortunately, circuit breakers are turned off during the last 25 minutes of trading.
      • On Wednesday, July 31, Knight Capital Group - one of the largest market making firms and an official Designated Market Maker on the NYSE - had an algorithm pushed out that was not properly tested. The result? A loss for them estimated at $440 million, untold economic losses for retail investors with stop-loss orders in one of the almost 140 stocks that were affected, and a further erosion in investor confidence.

    It’s rather ironic: while market participants fight regulators tooth-and-nail over every proposed rule change and reform, retail investors head for the hills. Naturally, investor-flight has greatly accelerated since the Flash Crash. In fact, it has now turned negative when indexed to 1996.

    As I’ve detailed in past posts, retail investors have been suffering under the current market. The economic costs of volatility, HFT algos that step in front of large institutional orders, undermined confidence, and the complete lack of a viable IPO market are dramatic. We need to take the following actions right away, and I will detail why in my next post:

    • Affirmative market making obligations for any firm that would collect a rebate on any venue where securities are transacted: ATS, ECN, Dark Pool or Exchange, it shouldn’t matter where.
    • Impose a 50 millisecond minimum quote life on all quotes (for reference, the blink of an eye is about 300 milliseconds).
    • Eliminate pay-for-order-flow.
    • Enforce price-time priority across all venues.
    • Eliminate co-location and direct exchange feeds.
    • Take away the self-regulatory authority of for-profit exchanges.
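    The minimum-quote-life proposal in the list above is simple to express mechanically. A toy sketch, assuming millisecond timestamps and a hypothetical gatekeeper sitting in front of the matching engine (all names are mine):

```python
class MinQuoteLife:
    """Toy enforcement of a 50 ms minimum quote life: a cancel arriving
    before the quote has rested 50 ms is rejected."""

    MIN_LIFE_MS = 50

    def __init__(self):
        self.quote_times = {}  # order_id -> entry timestamp (ms)

    def on_quote(self, order_id, now_ms):
        self.quote_times[order_id] = now_ms

    def on_cancel(self, order_id, now_ms):
        placed = self.quote_times.get(order_id)
        if placed is None:
            return 'unknown order'
        if now_ms - placed < self.MIN_LIFE_MS:
            return 'rejected: minimum quote life not met'
        del self.quote_times[order_id]
        return 'canceled'


gate = MinQuoteLife()
gate.on_quote('Q1', now_ms=0)
too_soon = gate.on_cancel('Q1', now_ms=10)  # 10 ms in: rejected
allowed = gate.on_cancel('Q1', now_ms=60)   # 60 ms in: canceled
```

    The point of the rule is exactly this asymmetry: fleeting quotes that exist only to be canceled within microseconds stop being free to post.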

    The beautiful visualization of fractal images belies a more terrifying reality. Fractals are a picture of a transition between two worlds - the orderly linear world and the chaotic non-linear world.

    They are a picture of the “Edge of Chaos”. The US Markets are on the edge of chaos right now. We’ve been seduced by the technological beauty of these intricately interwoven systems but are now being betrayed by the chaos and non-linearity of their interactions with each other. Some would say the tipping point has already been reached, but I’m not ready to go that far. They can be fixed, but we need to act now.

    (image credit: Pascal Agneray https://vialucispress.wordpress.com/2010/10/19/the-near-edge-of-chaos-dennis-aubrey/)
     
  4. 14:49 19th Jul 2012


    Reblogged from cowbirdy


    cowbirdy:

    For the last eight years, Aaron Huey has been photographing life on the Pine Ridge Indian Reservation in South Dakota, culminating in this month’s beautiful cover story for National Geographic Magazine.

    Aaron has been using Cowbird to help the Pine Ridge community tell their own stories directly, and starting today, you can find hundreds of their own, unedited voices on National Geographic’s website, as an embedded Cowbird collection.

    We hope this collaboration demonstrates a model that many other journalists and news organizations will follow for many other communities — using Cowbird as a way to give those communities a voice alongside the “official” account of their story. If you’re interested in piloting a project with us, please email us at hello@cowbird.com.

    You can browse the voices of Pine Ridge here — enjoy!

     
  5. 10:35


    Reblogged from thekidshouldseethis

    thekidshouldseethis:

    A downward lightning negative ground flash captured at 7,207 images per second. A negative stepped leader emerges from the cloud and connects with the ground forming a return stroke.

    From ZT Research, who is “trying to figure out how lightning works.”

    via Stellar.

     
  6. Confronting High-Frequency Trading: David vs. Goliath’s Evil Twin

    In my first blog post, I discussed the skewed incentives, unfair playing field, and dangerous consequences of high-frequency trading (HFT). Is it any wonder that you could apply the same adjectives to the financial services industry in general? HFT is a destructive force in the markets: It increases the cost of trading for most people, rewards an activity that provides little to no economic value, and damages investor confidence in the fairness of equity markets. It is, however, symptomatic of much bigger and more deeply seated problems in the industry. 

    Before we tackle these encompassing problems, we should consider several actions that legislators could take to help dampen or eliminate the ill effects of HFT:

    • Eliminate for-profit exchanges: The experiment in for-profit exchanges has been an utter disaster. The conversion of NYSE and NASDAQ from privately held, self-regulatory organizations to publicly traded ones has not worked. It is ironic that the financial services industry should be such a great example of capitalism’s limitations. Stock exchanges should serve the general population by ensuring fair and orderly markets. Likewise, they should serve the companies that are listed or who want to list by ensuring robust price discovery and capital formation. This is the first step to instituting any reform. Our exchanges must take actions to reduce their earnings per share to ensure a more stable market and economy. That is a tradeoff most public companies are not willing to make.
    • Mandate that liquidity providers fulfill obligations in order to collect rebates: If you’re willing to take the other side of a buy/sell trade (in other words, you provide liquidity, often called “market making”), you usually receive a rebate from the exchange instead of paying a fee. Electronic liquidity providers perform a critical function and should be compensated for maintaining orderly markets. They take on inventory risk and that can be costly. However, historically there have always been obligations for market makers to continue to provide liquidity even during market turbulence. This is an obligation to which HFT firms are not currently beholden. This must change. The ability to collect liquidity-providing rebates should only be available to those firms willing to stay in the market when conditions become difficult.
    • Level the data access playing field:
      • Eliminate co-location: For the firm willing and able to pay for it, data center space is available for rental right next to the stock exchange. The closer you are, the faster you receive data. This, of course, only benefits HFT firms (and the earnings per share of for-profit exchanges), providing their servers information milliseconds before the rest of the market. It can be argued that this makes it non-public information. To many of these modern servers, milliseconds are an eternity. When I worked in the industry, I built models that would take 40 microseconds (40 millionths of a second) from when data hit the network card to when the trade was sent out to the market. I have no doubt there are faster systems out there now, and those reaction times will only continue to decrease.
      • Eliminate direct market data feeds: These are the feeds coming directly from the exchange providing updates every time something in the market changes. These are only useful if you have tremendous computing power to process upwards of 2 million events per second. (Unless, of course, you can do this in your head, in which case, good for you!). The Foreign Exchange (FX) markets have long worked well under a different model, and I’m not convinced that anyone would suffer from less data. If market data feeds simply sent out a snapshot of the full order book every 100 milliseconds, who would be worse off?
    • Reinstate the uptick rule: For almost 70 years if you wanted to sell a stock short (i.e., wager that it would go down in value) you had to wait until that stock’s price moved up. A change to this simple rule in 2007 opened the door for many high-frequency traders. If you are a student of history, you know that cyclically the world looks very similar to the time that rule was first passed (1938). Maybe it’s time we reconsider? A long list of financial services experts (which notably does not include any HFT firms) agree with me on this one.
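    The 100-millisecond snapshot model mentioned in the list above is essentially conflation: absorb every change, publish only periodic state. A toy sketch of the idea (the class and field names are mine, not any real feed’s API; a real feed would fire the snapshot on a timer):

```python
class SnapshotFeed:
    """Toy conflated feed: absorb every top-of-book update, but publish
    only the latest state once per interval (e.g. every 100 ms), rather
    than one message per change."""

    def __init__(self):
        self.book = {}        # symbol -> (bid, ask)
        self.updates_seen = 0

    def on_update(self, symbol, bid, ask):
        self.book[symbol] = (bid, ask)  # newer update overwrites older
        self.updates_seen += 1

    def publish_snapshot(self):
        # In a real feed this would fire on a 100 ms timer; here we
        # just return a copy of the current consolidated state.
        return dict(self.book)


feed = SnapshotFeed()
for bid, ask in [(99.99, 100.01), (100.00, 100.02), (100.01, 100.03)]:
    feed.on_update('SPY', bid, ask)
snap = feed.publish_snapshot()  # three updates conflate to one state
```

    Anyone consuming the snapshot sees the same final book as a direct-feed subscriber; what disappears is the microsecond-by-microsecond churn in between, and with it much of the advantage of raw speed.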

    Unfortunately, without a substantial public uproar, I’m inclined to think the odds of any meaningful financial reform are very slim:

    • It has already been 2.5 years since the SEC decided that flash orders should be banned, a position most of the financial services world agreed with, and yet they have not been banned.
    • Regardless of how many billions of dollars JPMorgan’s prop trading desk loses, they (and others in the industry) continue to argue that institutions with federally insured deposits should still be able to “hedge” by making extremely large bets that look amazingly similar to the proprietary bets that the Volcker Rule is supposed to eliminate.
    • The financial reform act negotiated by Congress was already dramatically weakened by the time it was passed, and the industry has deployed its band of lobbyists to weaken it further as the legislation is turned into actionable regulation.

    Barry Ritholtz gives some excellent suggestions in his most recent column on ending too-big-to-fail (TBTF), while acknowledging that “[a]fter years of deregulation, it has become all but impossible to re-regulate modern banking. There was a brief window during the credit crisis, but that has passed. Today, profits trump soundness. Safety and security are secondary to risk-taking and speculation.”

    I couldn’t have said it better myself. We have no shortage of great ideas for how to fix the system. We simply lack the will and the ability to push meaningful reform through our corporate-controlled, hyper-partisan government.

     
  7. It Ain’t Rocket Surgery: The Skewed Incentives and Dangerous Consequences of High-Frequency Trading

    During my appearance on NPR’s Marketplace last week, I discussed why I left the high-frequency trading (HFT) business. When I learned that I was about to become a father, I was forced to re-examine many things in my life, not the least of which was a profession that added little value to the world. At 30 years old, I would regularly be trading positions worth millions of dollars, an extremely stressful daily undertaking. I began to fear that I was spending too much of my life’s stress budget on something that, quite honestly, was helping to wreak tremendous economic havoc. Ultimately I decided to walk away from an extremely lucrative profession.

    HFT sits within the part of the financial services industry called quantitative analysis. Researchers are often referred to as quants, and there’s a popular saying amongst quants: “It Ain’t Rocket Surgery.” The phrase conflates “rocket science” and “brain surgery” into a hybrid term that pokes fun at the fact that while what quants do is very hard, it isn’t as hard as either of those fields.

    Lately, I’ve thought about this saying in a different way. The most successful quants make huge amounts of money: annual bonuses in the millions of dollars. Quants who have proven themselves successful (or who just sound really smart and interview really well) can command base salaries very similar to those of top neurosurgeons, and without the 15 to 20 years it takes to advance in that profession. While working at one of the largest hedge funds in the world, I heard the story of a quant manager who was so offended by his meager $85 million bonus that he left the company. I’m not sure there’s an aerospace engineer or neurosurgeon or even a Major League Baseball player in the world who wouldn’t happily take this bonus.

    I wouldn’t argue that compensation in HFT is dramatically different from that in the rest of financial services, but it is certainly among the most extreme in the industry.

    This would all be well and good if high-frequency traders contributed some value to society at large. Examining the value proposition (or lack thereof) of HFT would take many blog posts, or an entire book (one such book, Broken Markets by Sal Arnuk and Joe Saluzzi, was just published). There is a case to be made that rather than providing liquidity and increasing market efficiency, HFT is doing the opposite (Nanex has produced some excellent research on this topic).

    I have personally seen how HFT increases market volatility, creating events such as the Flash Crash of May 2010. I was on the trading floor when the market disappeared and stopped functioning that day. While the standard explanation of the Flash Crash is an order that was too big for the market to handle, it had long seemed inevitable that Chaos Theory, and the nonlinear implications of sensitive dependence on initial conditions, would lead to a destructive positive feedback loop. In other words, a butterfly flapped its wings somewhere in Delaware, and the entire U.S. equity market fell apart for a few minutes. Nothing has changed since then, and individual securities demonstrate this behavior on a daily basis.
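    The “butterfly” point is the textbook notion of sensitive dependence on initial conditions. A minimal, purely illustrative sketch (the logistic map is a standard chaos-theory toy model, not a market model):

```python
# Two trajectories of the chaotic logistic map that start a billionth
# apart become completely uncorrelated within a few dozen steps.

def logistic(x, r=4.0):
    """One step of the logistic map x -> r * x * (1 - x)."""
    return r * x * (1 - x)

a, b = 0.4, 0.4 + 1e-9   # two nearly identical starting points
max_gap = 0.0
for step in range(60):
    a, b = logistic(a), logistic(b)
    max_gap = max(max_gap, abs(a - b))

print(f"largest divergence after 60 steps: {max_gap:.3f}")
```

    A billionth of a unit of initial difference grows to an order-one gap; in a market full of nonlinear feedback between algorithms, tiny perturbations can cascade in the same way.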

    There is no doubt that the Flash Crash undermined retail investor confidence, and small investors have been fleeing equity markets even as the economy flattens out and the market rises. According to the Wall Street Journal, “$370 billion has been withdrawn from U.S. stock funds by small investors” since this event. Some argue that equity outflows are a result of the worst financial crisis since the Great Depression. However, the stock market has risen by almost 19 percent in the two years following the Flash Crash (though in one month, from July to August 2010, the market dropped more than 16 percent). With this kind of volatility, is it any wonder that retail investors are fleeing a market that they no longer understand?

    There are much graver implications for our economy from increased market volatility and decreased investor confidence. Higher volatility creates a negative wealth effect. Retail investors tend to time the market poorly, buying when stocks are too high and selling when they are too low. In addition, higher correlation amongst stocks means that traditionally unrelated companies and sectors move in a more concerted fashion. This eliminates the benefits of diversification, an idea that has been championed to retail investors for decades. As retail investors pull out of the market, the bedrock of the greatest capital formation and price discovery mechanism ever created is disappearing, leaving behind statistically driven trading systems trying to game each other to shave pennies here and there. This is destroying the U.S. economy’s ability to support and grow new businesses.
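    The diversification point follows from standard portfolio arithmetic. A small sketch (the formula is the usual equal-weight portfolio variance; the volatility and correlation numbers are assumed for illustration):

```python
# For an equal-weight portfolio of n stocks, each with volatility sigma
# and pairwise correlation rho, portfolio volatility is:
#     sigma * sqrt((1 + (n - 1) * rho) / n)
# As rho rises toward 1, diversification buys you almost nothing.

import math

def portfolio_vol(sigma, n, rho):
    """Volatility of an equal-weight portfolio of n identical stocks."""
    return sigma * math.sqrt((1 + (n - 1) * rho) / n)

sigma, n = 0.30, 50   # 30% single-stock volatility, 50 stocks (assumed)
for rho in (0.1, 0.5, 0.9):
    print(f"rho={rho}: portfolio vol = {portfolio_vol(sigma, n, rho):.1%}")
```

    With low correlation a 50-stock portfolio is far calmer than any single holding; as correlation climbs, portfolio volatility approaches the single-stock level and the diversification benefit evaporates.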

    I entered the world of finance for many reasons. I grew up during the dot-com bubble, so I thought I knew how to pick stocks. The compensation was extremely high. There’s also huge appeal to “the game,” the relentless judgment passed by the market on a daily basis demonstrating how right or wrong you are. But at my core, I have philosophically been a capitalist my entire life. As I witnessed the destruction of capital markets, driven by crony capitalism and the ambition of a small wealthy population to become yet wealthier, I could no longer sit idly by and watch it happen. Something must be done to dramatically shake up the financial services industry, and while regulations are being written and manipulated by lobbyists, that will not happen. In my next post, I’ll talk about some possible solutions to this problem, and why we have little hope of seeing them passed without several more crashes, flash or otherwise.

     
  8. My Appearance on NPR’s Marketplace

    Like many people in this country, I grew up in a world that equated success with money. I read Ayn Rand and subscribed to her idea that your worth to society is perfectly measured by the amount of money you earn. I followed the path of least resistance, and with degrees in computer science and finance, I wound up in high-frequency trading in 2009.

    I thought I was happy. While the economy was falling apart, I was getting rich and so was everyone around me. Then last spring the test came back positive: my wife was pregnant. Suddenly I found myself thinking about how I would explain my job to my future daughter. What exactly did I do to make so much money? What exactly was I doing to add value to the world? After two years in high-frequency trading, I could see how little value it actually created. And I saw first-hand that it was nowhere near enough to offset the damage it wrought, in markets where computers create whipsaw volatility and fortunes are made and lost in milliseconds.

    I knew in my heart that I was nothing more than a leech, regardless of how impressive my trading algorithms were. The brain drain that sucked so many smart and talented people away from noble pursuits and into financial services to earn their fortunes made me question the very foundation of my economic assumptions. I decided to leave the industry and find a new path. My passion for start-ups had been with me my whole life. I found myself craving the challenge of creating something new and the rush of excitement when something you’ve built resonates with other people. I knew it would be difficult to quit my job in a terrible economic environment, start a new company, and have my first child. But in my heart I knew it was what I wanted. When my daughter was old enough, I wanted to be proud to tell her what I’ve done, and even prouder to tell her that, no matter what, she must follow her passion in life like her father did.

     
  9. Tumblr OAuth Test Post

    This is a test post

    This was posted from my sandbox via OAuth!