<div dir="ltr"><div><p>(sorry, this time again but with the links OK.)<br></p><p>After a while, I've came up with this mix. I face up sources
transformation
using RDF as an underlying unifying model of, for example and not
limited to: Tabular, XML, JSON, and even OLAP data sources as input.
Then, perform an 'ETL' and inference in a Loader layer where I can infer
types an so and then populate a semantic/semiotic graph. The idea is
that the graph is
flexible enough to be viewed as any of the APIs mentioned in the
document (Tabular, Neo4J, XML, JSON, etc). Any of this APIs are to be
implemented in an ad-hoc manner so there is no limit if you need another
format. I try to explain the benefits of doing things this way in the
document, like analysis, mining and drill. Apologize if I'm not clear
enough or even totally wrong with this, I wrote this not having my
medications at hand...</p><p>really, the first link is a very summarized
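To make this a bit more concrete, here is a minimal sketch of the pipeline in Python with rdflib, not taken from the Cognescent sources: a tabular input is lifted into an RDF graph by a 'Loader' step, a toy rule infers an extra type, and the same graph is then read back through a tabular/JSON view and an RDF serialization. The namespace, sample rows and inference rule are illustrative assumptions only.

from rdflib import Graph, Literal, Namespace, RDF, RDFS
import json

EX = Namespace("http://example.org/")  # hypothetical namespace, not part of the project

# 1. A tabular source (could just as well come from XML, JSON or OLAP).
rows = [
    {"id": "r1", "name": "Alice", "dept": "Sales"},
    {"id": "r2", "name": "Bob", "dept": "IT"},
]

# 2. Loader layer: lift the rows into a single unifying RDF graph.
g = Graph()
g.bind("ex", EX)
for row in rows:
    subject = EX[row["id"]]
    g.add((subject, RDF.type, EX["Row"]))
    for column, value in row.items():
        g.add((subject, EX[column], Literal(value)))

# Toy type inference: anything with an ex:dept is also an ex:Employee.
g.add((EX["Row"], RDFS.subClassOf, EX["Thing"]))
for subject in list(g.subjects(EX["dept"], None)):
    g.add((subject, RDF.type, EX["Employee"]))

# 3. Ad-hoc 'API' views over the same graph.
# Tabular/JSON view:
table = [
    {"id": str(s).rsplit("/", 1)[-1],
     "name": str(g.value(s, EX["name"])),
     "dept": str(g.value(s, EX["dept"]))}
    for s in g.subjects(RDF.type, EX["Employee"])
]
print(json.dumps(table, indent=2))

# RDF view (Turtle; rdflib can also emit RDF/XML or JSON-LD):
print(g.serialize(format="turtle"))

An XML or OLAP source would only need its own lifting loop; the graph and the views on top of it stay the same, which is the point of using RDF as the unifying model.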
Really, the first link is a very condensed README of what is going to be built. The other link is the Google Code project where the sources are hosted; check the 'Cognescent' folder to see an implementation of the semiotic graph.

https://drive.google.com/file/d/0BxxuOINjaiBNRER3c3d3NnBaVWs/edit?usp=sharing
http://cognescent.googlecode.com/

Best regards,
Sebastian Samaruga