Future Flash Crashes, Digital Darwinism & the Resurgence of Hardware

Category: Rabbit Hole

As part of the ET 2.0 expanded sandbox, I’ve asked Neville Crawley to write a weekly-ish “Down the Rabbit Hole” column with his observations on what he calls Big Compute, what I call non-human intelligences, and what the rest of the world calls AI. This is the biggest revolution in markets and the world today.

Neville will be publishing under his own byline in the near future — his commentary continues below.



Future flash crashes

Remember a few years back when a bogus AP tweet instantly wiped $100bn off the US markets? In April 2013 the Associated Press’ Twitter account was compromised by hackers who tweeted “Breaking: Two Explosions in the White House and Barack Obama is injured.”

[Chart: the Dow’s sudden intraday drop following the tweet, April 23, 2013. For illustrative purposes only. Source: The Washington Post, 04/23/13; Bloomberg L.P., 04/23/13.]

The tweet was quickly confirmed to be an alternative fact (as we say in 2017), but not before the Dow dropped 145 points (1%) in two minutes.

Well, my view is that we are heading into a far more ‘interesting’ era of flash crashes, driven by algorithms that are confused or deliberately misled. In a concise paper titled “Deceiving Google’s Cloud Video Intelligence API Built for Summarizing Videos”, researchers from the University of Washington demonstrate that by inserting still images of a plate of noodles (amongst other things) into an unrelated video, they could trick a Google image-recognition algorithm into thinking the video was about a completely different topic.
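To see how a handful of inserted frames can flip a label, here is a hypothetical toy model — my own simplification, not the paper’s actual method or the real API’s behavior. The assumption: if a classifier only samples frames at a fixed rate, an attacker who knows that rate can place a still image at exactly the sampled positions and dominate the vote while changing only a few percent of the frames.

```python
# Hypothetical toy model of the frame-insertion attack (my simplification;
# the real attack targets Google's Cloud Video Intelligence API with
# actual video files, not lists of strings).

def insert_still_frames(frames, still, period=25):
    """Insert `still` so it lands at every index a stride-`period`
    sampler will inspect (index 0, period, 2*period, ...)."""
    doctored = []
    for i in range(0, len(frames), period - 1):
        doctored.append(still)                     # the sampled position
        doctored.extend(frames[i:i + period - 1])  # untouched content
    return doctored

def sample_and_label(frames, period=25):
    """A naive classifier: sample one frame per `period` and label the
    whole video by majority vote over the sampled frames."""
    sampled = frames[::period]
    return max(set(sampled), key=sampled.count)

video = ["dog"] * 96                 # ~4 seconds of "dog" frames at 25 fps
doctored = insert_still_frames(video, "noodles")
print(sample_and_label(video))       # -> dog
print(sample_and_label(doctored))    # -> noodles, despite only ~4% of the
                                     #    frames being adversarial
```

The point of the sketch: the attacker never has to touch the bulk of the content, only the frames the model actually looks at — which is exactly why a market-moving algorithm skimming headlines or video feeds is such an inviting target.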

Digital Darwinism

I’m not sure I totally buy the asserted causality on this one, but the headline story is just irresistible: “Music Streaming Is Making Songs Faster as Artists Compete for Attention.” Paper abstract:

Technological changes in the last 30 years have influenced the way we consume music, not only granting immediate access to a much larger collection of songs than ever before, but also allowing us to instantly skip songs. This new reality can be explained in terms of attention economy, which posits that attention is the currency of the information age, since it is both scarce and valuable. The purpose of these two studies is to examine whether popular music compositional practices have changed in the last 30 years in a way that is consistent with attention economy principles. In the first study, 303 U.S. top-10 singles from 1986 to 2015 were analyzed according to five parameters: number of words in title, main tempo, time before the voice enters, time before the title is mentioned, and self-focus in lyrical content. The results revealed that popular music has been changing in a way that favors attention grabbing, consistent with attention economy principles. In the second study, 60 popular songs from 2015 were paired with 60 less popular songs from the same artists. The same parameters were evaluated. The data were not consistent with any of the hypotheses regarding the relationship between attention economy principles within a comparison of popular and less popular music.

Meanwhile, in other evolutionary news, apparently robots have been ‘mating’ and evolving in an evo-devo stylee. DTR? More formal translation: Researchers have added complexity to the field of evolutionary robotics by demonstrating for the first time that, just like in biological evolution, embodied robot evolution is impacted by epigenetic factors. Original Frontiers in Robotics and AI (dense!) paper here. Helpful explainer article here.

The resurgence of hardware

As we move from a Big Data paradigm of commoditized and cheap AWS storage to a Big Compute paradigm of high-performance chips (and other non-silicon compute methods), we are discovering step-change innovation in applied processing power driven by the Darwinian force of specialization, or, as Chris Dixon recently succinctly tweeted: “Next stage of Moore’s Law: less about transistor density, more about specialized chips.”

We are seeing the big guys like Google develop specialized chips custom-built for their specific big compute needs, running up to 30 times faster than today’s conventional processors while using much less power.

Also, we are seeing more real-world applications developed for truly evolutionary-leap technologies like quantum computing. MIT Technology Review article on implementing the powerful Grover’s quantum search algorithm here.
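For a feel of what Grover’s algorithm buys you, here is a classical simulation of the smallest case (2 qubits, 4 items) — a sketch of the math only, not the hardware implementation the article describes. Prepare a uniform superposition, flip the sign of the marked item (the oracle), then reflect every amplitude about the mean (the diffusion step):

```python
# Classical statevector simulation of Grover's search over 4 items -- a toy
# sketch of the math, not a quantum device.

def grover_probabilities(marked, n=4):
    state = [1 / n ** 0.5] * n             # uniform superposition over n items
    state[marked] = -state[marked]         # oracle: flip the marked amplitude's sign
    mean = sum(state) / n
    state = [2 * mean - a for a in state]  # diffusion: reflect about the mean
    return [a * a for a in state]          # measurement probabilities

print(grover_probabilities(2))  # -> [0.0, 0.0, 1.0, 0.0]: one iteration
                                #    finds the marked item with certainty
```

For four items a single iteration suffices; in general the number of iterations needed grows only as the square root of the search-space size, which is the source of the quantum speedup over checking items one by one.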

And, finally, because it just wouldn’t be a week in big compute-land without a machine beating a talented group of humans at one game or another: Poker-Playing Engineers Take on AI Machine – And Get Thrashed.

Key points:

  1. People misunderstand what computers and people are each good at. People think that bluffing is very human, but it turns out that’s not true: a computer can learn from experience that if it has a weak hand and it bluffs, it can make more money.
  2. The AI didn’t learn to bluff from mimicking successful human poker players, but from game theory. Its strategies were computed from just the rules of the game, not from analyzing historical data.
  3. Also evident was the relentless decline in price and increase in performance of running advanced ‘big compute’ applications; the computing power used for this poker win can be had for under $20k.
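Point 2 is concrete enough to sketch. Game-theoretic poker bots are built on regret minimization, and its simplest building block, regret matching, can be demonstrated on rock-paper-scissors. The toy below is my illustration, not the actual poker system: two copies of the algorithm play each other using only the payoff table — no historical data — and their average strategies converge toward the game-theoretic optimum of one-third each.

```python
import random

# Toy regret matching on rock-paper-scissors (illustrative only).
random.seed(0)
PAYOFF = [[0, -1, 1],   # row action vs column action, row player's payoff
          [1, 0, -1],   # actions: 0 = rock, 1 = paper, 2 = scissors
          [-1, 1, 0]]

def strategy(regrets):
    """Mix in proportion to positive cumulative regret (uniform if none)."""
    pos = [max(r, 0.0) for r in regrets]
    total = sum(pos)
    return [p / total for p in pos] if total > 0 else [1 / 3] * 3

def sample(probs):
    """Draw an action index from a probability distribution."""
    r, c = random.random(), 0.0
    for a, p in enumerate(probs):
        c += p
        if r < c:
            return a
    return 2

def train(iterations=100_000):
    regrets = [[0.0] * 3 for _ in range(2)]
    strat_sum = [[0.0] * 3 for _ in range(2)]
    for _ in range(iterations):
        strats = [strategy(regrets[p]) for p in range(2)]
        acts = [sample(strats[p]) for p in range(2)]
        for p in range(2):
            opp = acts[1 - p]
            base = PAYOFF[acts[p]][opp]  # what this player actually earned
            for a in range(3):           # regret: what each action would have earned
                regrets[p][a] += PAYOFF[a][opp] - base
                strat_sum[p][a] += strats[p][a]
    return [[s / iterations for s in strat_sum[p]] for p in range(2)]

avg = train()
print(avg[0])  # each probability ends up close to 1/3
```

Nothing here was learned from watching humans: the equilibrium falls out of the payoff table alone, which is the sense in which the poker AI’s strategies were “computed from just the rules of the game.”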

epsilon-theory-rabbit-hole-ben-hunt-may-4-2017.pdf (296 KB)