Neural memories based on distributed representations within neural networks support robust retrieval in the presence of noisy, partial, and approximate inputs. They offer an alternative learning paradigm for content-based storage and associative retrieval: patterns extracted from the inputs are encoded as distributed representations that capture the essential features of the data. In particular, our focus is on storing and retrieving information from text and knowledge graphs, representing the inter-relationships between entities directly within the neural models to facilitate associative retrieval. Such memories promise more effective search for information and connections than is possible today over multi-modal data spanning text, images, and knowledge graphs.
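As an illustrative sketch (not the system described here), a classical Hopfield network shows the core property of such memories: stored patterns become attractors of the dynamics, so retrieval from a noisy or partial cue settles onto the closest stored pattern. All names and parameters below are assumptions for the sketch.

```python
import numpy as np

def store(patterns):
    """Hebbian learning: build a weight matrix from +/-1 patterns."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)  # no self-connections
    return W / n

def retrieve(W, cue, steps=10):
    """Update the state until it settles on a stored attractor."""
    s = cue.copy()
    for _ in range(steps):
        s_next = np.where(W @ s >= 0, 1, -1)
        if np.array_equal(s_next, s):
            break
        s = s_next
    return s

rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(3, 64))  # three stored 64-bit patterns
W = store(patterns)

# Corrupt 10 of 64 bits of the first pattern, then retrieve it.
cue = patterns[0].copy()
flip = rng.choice(64, size=10, replace=False)
cue[flip] *= -1
recovered = retrieve(W, cue)
print(np.array_equal(recovered, patterns[0]))
```

The same idea carries over to modern embedding-based memories, where retrieval is a similarity search in a learned representation space rather than attractor dynamics over binary units.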
We are currently developing neural memories and bio-learning techniques to create novel text embeddings.