<div dir="ltr"><p>After a while, I've came up with this mix. I face up sources transformation
using RDF as an underlying unifying model of, for example and not
limited to: Tabular, XML, JSON, and even OLAP data sources as input.
Then, perform an 'ETL' and inference in a Loader layer where I can infer
types an so and then populate a semantic/semiotic graph. The idea is that the graph is
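<p>To make the Loader idea concrete, here is a minimal sketch, assuming Python with rdflib; the ex: namespace, the Person class, the row contents and the type-inference rules are made up for illustration and are not part of the actual project.</p>
<pre>
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

EX = Namespace("http://example.org/")  # hypothetical namespace

g = Graph()
g.bind("ex", EX)

# One record coming from a tabular/JSON source (made-up data).
row = {"id": "42", "name": "Alice", "age": "37"}

# Naive type inference: the source entity name becomes the RDF class.
subject = EX["person/" + row["id"]]
g.add((subject, RDF.type, EX.Person))

for key, value in row.items():
    if key == "id":
        continue
    # Naive datatype inference: digit-only strings become xsd:integer.
    if value.isdigit():
        obj = Literal(int(value), datatype=XSD.integer)
    else:
        obj = Literal(value)
    g.add((subject, EX[key], obj))

print(g.serialize(format="turtle"))
</pre>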
<p>The idea is that the graph is flexible enough to be viewed through any of the APIs mentioned in the document (Tabular, Neo4J, XML, JSON, etc.). Any of these APIs can be implemented in an ad-hoc manner, so there is no limit if you need another format. I try to explain the benefits of doing things this way in the document, such as analysis, mining and drill-down. Apologies if I'm not clear enough, or even totally wrong about this; I wrote it without having my medications at hand...</p>
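<p>And here is a sketch of how the same populated graph could be exposed through two of those ad-hoc views, again assuming Python with rdflib; the SPARQL-as-table and JSON-LD choices are just one possible illustration, not necessarily how the project implements its Tabular and JSON APIs.</p>
<pre>
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

EX = Namespace("http://example.org/")  # hypothetical namespace, as above

# A tiny graph standing in for the populated semantic graph.
g = Graph()
g.bind("ex", EX)
alice = EX["person/42"]
g.add((alice, RDF.type, EX.Person))
g.add((alice, EX.name, Literal("Alice")))
g.add((alice, EX.age, Literal(37, datatype=XSD.integer)))

# Tabular view: a SPARQL SELECT yields rows and columns, like a table API.
results = g.query(
    "SELECT ?person ?name ?age WHERE { "
    "?person a ex:Person ; ex:name ?name ; ex:age ?age . }",
    initNs={"ex": EX},
)
for person, name, age in results:
    print(person, name, age)

# JSON view: the very same graph serialized as JSON-LD (rdflib 6+).
print(g.serialize(format="json-ld"))
</pre>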
<p>Readme doc:<br><a href="https://drive.google.com/file/d/0BxxuOINjaiBNRER3c3d3NnBaVWs/edit?usp=sharing">https://drive.google.com/file/d/0BxxuOINjaiBNRER3c3d3NnBaVWs/edit?usp=sharing</a><br><a href="http://cognescent.googlecode.com">http://cognescent.googlecode.com</a><br><br>Regards,<br>Sebastian Samaruga.<br></p></div>