Neurological Time Series/Anomaly Detection: Hierarchical Temporal Memory

predicting power consumption with a ‘closer to biology’ neural network

Image: 3D-imaged & colored section of a hippocampus (University of Hong Kong)

Hitting the Gym

The data: a gym's power consumption, logged as (timestamp, power value) readings — the classic "hot gym" example that HTM demos are built around.

Binary Encodings

from htm.bindings.sdr import SDR, Metrics
from htm.encoders.date import DateEncoder
from htm.encoders.rdse import RDSE, RDSE_Parameters

dateEncoder = DateEncoder(
    timeOfDay = (30, 1),  # DateTime is a composite variable:
    weekend = 21          # how many bits to allocate to each part
)

scalarEncoderParams = RDSE_Parameters()    # parameters for encoding a continuous variable
scalarEncoderParams.size = 700             # SDR size in bits
scalarEncoderParams.sparsity = 0.02        # 2% sparsity magic number
scalarEncoderParams.resolution = 0.88
scalarEncoder = RDSE(scalarEncoderParams)  # Random Distributed Scalar Encoder

encodingWidth = dateEncoder.size + scalarEncoder.size
enc_info = Metrics([encodingWidth], 999999999)  # performance-metrics storage object
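
To sanity-check the encoders, you can encode one reading and inspect the result — a quick sketch, where sampleTime and sampleConsumption are made-up values, not part of the original pipeline:

import datetime

sampleTime = datetime.datetime(2010, 7, 2, 7, 50)  # hypothetical timestamp
sampleConsumption = 21.2                           # hypothetical power reading

dateBits = dateEncoder.encode(sampleTime)                  # SDR of dateEncoder.size bits
consumptionBits = scalarEncoder.encode(sampleConsumption)  # SDR of 700 bits

print(consumptionBits.getSum())       # number of ON bits: ~2% of 700, i.e. ~14
print(consumptionBits.getSparsity())  # ~0.02, matching the sparsity parameter

Similar inputs get overlapping ON bits, which is the whole point of SDR encodings: the pooler downstream sees similarity as shared active bits.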

A Dip in the Pool

so it’s Battleship, kinda
from htm.bindings.algorithms import SpatialPooler

# parameter dict (values recovered from the inline comments)
spParams = {
    "columnCount": 1638,
    "potentialPct": 0.85,
    "localAreaDensity": 0.04,
    "synPermInactiveDec": 0.006,
    "synPermActiveInc": 0.04,
    "synPermConnected": 0.13,
    "boostStrength": 3.0,
}

sp = SpatialPooler(
    inputDimensions = (encodingWidth,),
    columnDimensions = (spParams["columnCount"],),
    potentialPct = spParams["potentialPct"],
    potentialRadius = encodingWidth,
    globalInhibition = True,
    localAreaDensity = spParams["localAreaDensity"],
    synPermInactiveDec = spParams["synPermInactiveDec"],
    synPermActiveInc = spParams["synPermActiveInc"],
    synPermConnected = spParams["synPermConnected"],
    boostStrength = spParams["boostStrength"],
    wrapAround = True
)
sp_info = Metrics(sp.getColumnDimensions(), 999999999)
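
A quick way to see the pooler's fixed sparsity in action — a sketch reusing dateBits and consumptionBits from the encoder snippet above, with learn=False so nothing in the model is modified:

encodingSample = SDR(encodingWidth).concatenate([consumptionBits, dateBits])
activeColumns = SDR(sp.getColumnDimensions())
sp.compute(encodingSample, False, activeColumns)  # inference only, no learning
print(activeColumns.getSum())  # ~localAreaDensity * columnCount ≈ 0.04 * 1638 ≈ 65 columns

No matter how dense or noisy the input, global inhibition keeps roughly the same small fraction of columns active.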

Walking Down Memory Lane

from htm.bindings.algorithms import TemporalMemory

# parameter dict (values recovered from the inline comments)
tmParams = {
    "cellsPerColumn": 13,
    "activationThreshold": 17,
    "initialPerm": 0.21,
    "minThreshold": 19,
    "newSynapseCount": 32,
    "permanenceInc": 0.1,
    "permanenceDec": 0.1,
    "maxSegmentsPerCell": 128,
    "maxSynapsesPerSegment": 64,
}
tm = TemporalMemory(
    columnDimensions = (spParams["columnCount"],),
    cellsPerColumn = tmParams["cellsPerColumn"],
    activationThreshold = tmParams["activationThreshold"],
    initialPermanence = tmParams["initialPerm"],
    connectedPermanence = spParams["synPermConnected"],
    minThreshold = tmParams["minThreshold"],
    maxNewSynapseCount = tmParams["newSynapseCount"],
    permanenceIncrement = tmParams["permanenceInc"],
    permanenceDecrement = tmParams["permanenceDec"],
    predictedSegmentDecrement = 0.0,
    maxSegmentsPerCell = tmParams["maxSegmentsPerCell"],
    maxSynapsesPerSegment = tmParams["maxSynapsesPerSegment"]
)
tm_info = Metrics([tm.numberOfCells()], 999999999)
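
Temporal memory is what gives us an anomaly score for free: after each compute(), tm.anomaly reports the fraction of active columns the model failed to predict. A toy sketch of that behavior, feeding the same pooled SDR from the pooler snippet repeatedly — purely illustrative, and since learn=True here would carry over, try it on a scratch model:

tm.reset()  # clear any sequence context
for step in range(5):
    tm.compute(activeColumns, learn=True)
    print(step, tm.anomaly)  # starts at 1.0 (pure surprise), falls toward 0.0 as the repeat is learned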

Time to Train

import csv
import datetime
import numpy as np
from htm.bindings.algorithms import Predictor
from htm.algorithms.anomaly_likelihood import AnomalyLikelihood

# read the input CSV into (timestamp, consumption) rows
# (assuming the gymdata.csv that ships with the htm.core examples, which has three header rows)
with open("gymdata.csv") as fileIn:
    reader = csv.reader(fileIn)
    next(reader); next(reader); next(reader)  # skip header rows
    records = list(reader)

predictor = Predictor(steps=[1, 5], alpha=0.1)  # continuous-output predictor, 1 & 5 steps ahead
predictor_resolution = 1

# converts raw anomaly scores into likelihoods over a rolling history
# (default periods here; the htm.core hotgym example derives them from the data length)
anomaly_history = AnomalyLikelihood()

inputs = []  # create input/output lists
anomaly = []
anomalyProb = []
predictions = {1: [], 5: []}

predictor.reset()  # reset the predictor
for count, record in enumerate(records):  # iterate through the data
    dateValue = datetime.datetime.strptime(record[0], "%m/%d/%y %H:%M")  # unstring the timestamp
    consumption = float(record[1])  # unstring the power value
    inputs.append(consumption)      # add power to inputs

    # use the encoders: create SDRs for each input value
    dateBits = dateEncoder.encode(dateValue)
    consumptionBits = scalarEncoder.encode(consumption)

    # concatenate these encoded SDRs into a larger one for pooling
    encoding = SDR(encodingWidth).concatenate([consumptionBits, dateBits])
    enc_info.addData(encoding)  # enc_info is our metrics object, tracking how the encoder fares

    # create an SDR to represent active columns; it'll be populated by .compute().
    # notably, this activeColumns SDR has the same dimensions as the spatial pooler
    activeColumns = SDR(sp.getColumnDimensions())

    # throw the input into the spatial pool and hope it swims
    sp.compute(encoding, True, activeColumns)  # we're training, so learn=True
    sp_info.addData(activeColumns)

    # pass the pooled SDR through temporal memory
    tm.compute(activeColumns, learn=True)
    tm_info.addData(tm.getActiveCells().flatten())

    # make predictions based on the current input & memory context
    pdf = predictor.infer(tm.getActiveCells())
    for n in (1, 5):
        if pdf[n]:
            predictions[n].append(np.argmax(pdf[n]) * predictor_resolution)
        else:
            predictions[n].append(float('nan'))

    # record the raw anomaly score and its likelihood
    anomalyLikelihood = anomaly_history.anomalyProbability(consumption, tm.anomaly)
    anomaly.append(tm.anomaly)
    anomalyProb.append(anomalyLikelihood)

    # reinforce output connections
    predictor.learn(count, tm.getActiveCells(), int(consumption / predictor_resolution))
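
The error figures below can be reproduced roughly the way the htm.core hotgym example computes them: shift each prediction list so predictions line up with the values they were predicting, then take the root-mean-squared error. A sketch — rmse is a name I'm introducing here:

import math

# align predictions with the timesteps they were predicting
for n_steps, pred_list in predictions.items():
    for _ in range(n_steps):
        pred_list.insert(0, float('nan'))
        pred_list.pop()

rmse = {}  # per-horizon error
for n, preds in predictions.items():
    errs = [(inp - p) ** 2 for inp, p in zip(inputs, preds) if not math.isnan(p)]
    rmse[n] = (sum(errs) / len(errs)) ** 0.5
print(rmse)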

Results

RMSE per prediction horizon:
{1: 0.07548016042172133, 5: 0.0010324285729320193}

power_consumption:
    min:  10
    max:  90.9
    mean: 31.3

Anomaly plot Y-axis: normalized power units. I love seaborn.
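
The original plots aren't reproduced here, but a minimal seaborn/matplotlib sketch of the anomaly traces — assuming the anomaly and anomalyProb lists built in the training loop — would be:

import matplotlib.pyplot as plt
import seaborn as sns

sns.set_theme()
plt.plot(anomaly, label="raw anomaly score")
plt.plot(anomalyProb, label="anomaly likelihood")
plt.xlabel("timestep")
plt.ylabel("score")
plt.legend()
plt.show()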

Good job, brain-model

