Hexagons and Geospatial Encoding
bees figured this out ages ago
If you’re like me, you’ve always had a strange feeling that bees (much like dolphins) know more than they’re letting on.
They know architecture, and communicate navigational information through dance. Their elaborate caste-based hierarchy and collective survival impetus is striking. There is something mathematical about them, and that is deeply unsettling.
I’ve recently been looking into encoding data to binary representations for use in neuromorphic AI systems. I’ve always been fascinated with mathematical and artistic representations of space, distance and volume.
Searching for the intersection of these ideas has led me to the same destination as bees: the hexagon.
Let me make this quite clear: we are only starting to learn the extent to which honey-gathering insects know more than us. Bees build their hives with tessellating hexagons. What do they know?
A Hex Upon Us All
Evolution drives efficiency like a function approaching a limit. Consider bees building a honeycomb: they use wax (derived from honey) to build a structure to store honey. What’s efficient in this scenario?
- Maximizing honey storage capacity
- Minimizing wax use
- Ensuring stability
If you had to stack equally sized square boxes to take up the least amount of space, you’d line them up nicely to minimize wasted space in between — this is tessellation. But it wouldn’t work with octagonal boxes.
Only three shapes can tessellate without space in between: equilateral triangles, squares and hexagons. Why have bees chosen the latter?
I’m far from the first to eye our small yellow-striped neighbors with suspicion. 2,000 years ago, the great Roman scholar Marcus Terentius Varro proposed a “Honeycomb Conjecture”.
He reasoned that hexagonal hives are more compact, generally speaking, than other assortments, and had overall smaller perimeters. Thomas Hales formally proved this in 1999.
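We can actually check Varro's intuition numerically. A regular n-gon with area A has perimeter P = 2·√(n·A·tan(π/n)), so a quick sketch comparing the three tessellating shapes at equal (unit) area:

```python
import math

def perimeter_for_unit_area(n):
    """Perimeter of a regular n-gon enclosing area 1.

    From A = n * s**2 / (4 * tan(pi/n)), solve for the side s,
    then multiply by n sides: P = 2 * sqrt(n * tan(pi/n))."""
    return 2 * math.sqrt(n * math.tan(math.pi / n))

for name, n in [("triangle", 3), ("square", 4), ("hexagon", 6)]:
    print(f"{name:8s} P = {perimeter_for_unit_area(n):.4f}")
# hexagon wins: ~3.72, vs 4.00 for the square and ~4.56 for the triangle
```

Less perimeter per unit of area means less wax per unit of honey, which is exactly what Hales proved holds in general.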
Bees that made inefficiently shaped hives stored less honey, which also had to be used to make more wax (for the same relative size). Colonies with the most efficient architecture outcompeted their rivals across the globe.
The image above illustrates this as an intuitive hypothesis: incremental improvements over time lead from circular comb arrangements to hexagons. This ensures structural stability by connecting neighboring cells in a unified arrangement.
Krulwich at NPR also provides an interesting analysis of labor efficiency: with tessellating cell shapes, many workers can assemble and connect cells concurrently without need for organizational bottlenecks wasting valuable time.
Welcome to the Grid
This efficiency has created a world where entire ecosystems hinge upon their pollination. We are reliant on these insects for our very sustenance. I don’t trust these buzzing busybodies, and after you hear about grid cells, neither will you.
Back in 2005, some scientists hooked an electrode up to a single neuron in a rat’s entorhinal cortex and let it run around a small enclosure:
As the rat explores this new environment, the neuron (termed a grid cell) occasionally activates, and the electrode records each location where the cell spiked. It initially seems random, but over the course of a minute, a hexagonal grid begins to emerge.
These grid cells are likely playing a huge role in our brains’ spatial memory and navigation. The trick is that we have far more than one grid cell all working in tandem.
There’s a great video from Numenta explaining how multiple grid cells collaborate to pinpoint your location with increasing precision.
Think about the rat video again, and consider that other grid cells are firing ‘in between’ the recorded cell’s hexagonal grid. This series of overlapping, interconnected grids allows us to “map out” our environment.
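One way to see why overlapping grids boost precision: a single periodic grid only tells you your position modulo its spacing, but several grids with different spacings combine like remainders in modular arithmetic. A toy 1D sketch (the spacings 3, 4, and 5 are invented for illustration, not measured from any brain):

```python
# Each "module" of grid cells fires periodically, so on its own it
# only reports position modulo its spacing.
spacings = [3, 4, 5]

def readout(position):
    """The joint firing pattern across all modules."""
    return tuple(position % s for s in spacings)

# Every position from 0 to 59 (= 3 * 4 * 5) produces a unique combined
# pattern, even though no single module can tell 1 from 4 from 7.
patterns = {readout(p) for p in range(60)}
print(len(patterns))  # 60 distinct patterns for 60 positions
```

Three small, cheap grids jointly cover a range no single one could, which is the kind of efficiency evolution tends to find.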
And spatial navigation is crucial in terms of evolutionary competition. Our nomadic ancestors crossed vast swathes of wilderness to hunt, and had to navigate their way home without the aid of GPS.
Evolution drives efficiency, and seems to have driven our spatial memory cortex towards a hexagonal grid structure. Sound familiar?
HTM’s Grid Cell Encoder
Bees use hexagonal grids to maximize volume efficiency. Our brains use hexgrids for a similar efficiency-maximizing purpose: packing as much navigational ability as possible into our limited grey matter.
If it’s such an efficient pattern, the next question to ask is:
How can we get our AI to think like this?
I’ve been lauding the strengths of Hierarchical Temporal Memory neural networks for a while now, and questioning how we can convert more data types into the same binary representations our brain processes.
The next binary horizon may be geospatial data: mapping grid coordinates to binary vectors for use in all sorts of predictive AI. Fortunately for us, solid groundwork has already been laid.
Some fellows at HTM.core’s open-source repository put together a grid cell encoder: a module that mimics the function of entorhinal grid neurons to convert 2D coordinates to Sparse Distributed Representations. I spun it up to see what the demonstration code can offer.
We create a GridCellEncoder and use a pre-defined arena_size (100 in this example) to convert a batch of coordinates to SDRs.
When we show the stacked cell SDRs, we have multiple receptive fields — or variations of cell density — imposed upon the same room.
Keep in mind that the purpose of such an encoder is to convert 2D coordinates (latitude and longitude, for example) to SDRs for use in binary processing in an HTM network.
The tricky part of encoding is ensuring that similar input data results in a similar enough SDR such that bit overlap effectively represents semantic similarity.
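To make that overlap property concrete, here’s a toy stand-in for a grid cell encoder. This is not the htm.core implementation: the module count, periods, orientations, and bin sizes below are invented for illustration, and each module uses a square lattice for simplicity where real grid cells are hexagonal. Each module maps an (x, y) point to a phase on its own tilted periodic lattice and activates one bit; nearby points land in mostly the same bins, distant points don’t:

```python
import math
import random

def make_modules(n_modules=50, seed=42):
    """One lattice per module, with a random period and orientation."""
    rng = random.Random(seed)
    return [(rng.uniform(10, 40), rng.uniform(0, math.pi))
            for _ in range(n_modules)]

def encode(x, y, modules, bins=5):
    """One active bit per module: the phase bin the point falls into."""
    active = set()
    for i, (period, theta) in enumerate(modules):
        # Project onto the module's two lattice axes and wrap to [0, 1).
        u = ((x * math.cos(theta) + y * math.sin(theta)) / period) % 1.0
        v = ((-x * math.sin(theta) + y * math.cos(theta)) / period) % 1.0
        bit = int(u * bins) * bins + int(v * bins)
        active.add(i * bins * bins + bit)  # each module owns its own bit range
    return active

modules = make_modules()
a = encode(10.0, 10.0, modules)
b = encode(11.0, 10.0, modules)   # 1 unit away from a
c = encode(60.0, 70.0, modules)   # far away from a
print(len(a & b), len(a & c))     # near overlap >> far overlap
```

Points one unit apart share most of their active bits, while distant points share almost none, so bit overlap tracks spatial proximity — which is exactly the semantic-similarity property an HTM network needs from its encoder.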
The grid cell encoder actually seems to accomplish this quite well, which makes me quite excited at the possibilities for scalar-coordinate encoding. HTM nets are strongest with time-series data “in motion”, so geospatial tracking of shipping routes or ecological migration would likely benefit most.
Perhaps more importantly, it does so using hexagons, honoring the old rule that form follows function. If we want to catch up to the bees, we’re going to need powerful AI to help us. This is a solid step forward on that journey.