Details

    • Type: Bug
    • Status: Done
    • Resolution: Done
    • Affects Version/s: BIGDATA_RELEASE_1_3_0
    • Fix Version/s: None
    • Component/s: blueprints

      Activity

      bryanthompson added a comment -

      Mike, once you have the initial implementation let's schedule some time to discuss the RDF GOM implementation and whether it can provide a more scalable approach to the blueprints API. The primary benefits would be:


      - the ability to materialize part of the graph on the client
      - the ability to page through the link sets of the graph automatically
      - the ability to use SPARQL to efficiently materialize a (sub-)graph on the client (see the sketch below)
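
      For illustration of that last point only, a minimal sketch: a client can materialize a bounded sub-graph with a plain SPARQL CONSTRUCT against the server's SPARQL endpoint. The endpoint URL matches the examples later in this ticket; the vertex URI is hypothetical.

      > curl -H "accept:application/rdf+xml" --data-urlencode 'query=CONSTRUCT { <http://example.com/vertex/1> ?p ?o } WHERE { <http://example.com/vertex/1> ?p ?o }' http://localhost:9999/bigdata/sparql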

      Bryan

      mikepersonick added a comment -

      I'm very close to having a first pass of this completely working. The goal is to make it very easy to download bigdata and start using it through the Gremlin console and the Blueprints API. Here is what the startup will look like:

      Get Bigdata and Start Server:
      > svn checkout svn://svn.code.sf.net/p/bigdata/code/branches/BIGDATA_RELEASE_1_3_0 bigdata
      > cd bigdata
      > ant start
      
      Get a Sample GraphML (Tinkerpop Property Graph):
      > curl -O https://raw.githubusercontent.com/tinkerpop/gremlin/master/data/graph-example-1.xml
      
      Post it to Bigdata:
      > curl --upload-file graph-example-1.xml -H "content-type:application/graphml+xml" -X POST http://localhost:9999/bigdata/sparql?blueprints
      
      Install and Start Gremlin Console:
      > cd bigdata
      > ant gremlin
      > ./ant-build/gremlin-groovy-2.5.0/bin/gremlin.sh
               \,,,/
               (o o)
      -----oOOo-(_)-oOOo-----
      gremlin>
      
      Create a Client Graph in the Gremlin Console:
      gremlin> g = new com.bigdata.blueprints.BigdataGraphClient("http://localhost:9999/bigdata/sparql")
      gremlin> g.V
      gremlin> g.E
      gremlin> g.shutdown()
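
      Once the sample graph is loaded, a quick sanity check from the console might look like this (a sketch only; the 'knows' edge label and 'name' property come from the TinkerPop sample data, and the exact output format may differ):

      gremlin> g = new com.bigdata.blueprints.BigdataGraphClient("http://localhost:9999/bigdata/sparql")
      gremlin> // names of the people that "marko" knows in the sample graph
      gremlin> g.V('name','marko').out('knows').name
      gremlin> g.shutdown()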
      

      I just need to clean up and document it, and then it's ready to go.

      mikepersonick added a comment -

      I've committed a Blueprints/Gremlin integration. It includes tools that make it very easy to get started with bigdata as a graph database via the Blueprints/Gremlin APIs.

      1. Go get bigdata and start the server:

      > svn checkout svn://svn.code.sf.net/p/bigdata/code/branches/BIGDATA_RELEASE_1_3_0 bigdata
      > cd bigdata
      > ant start-bigdata
      

      2. Go get the Tinkerpop Property Graph (sample GraphML data) and POST it to bigdata:

      > curl -O https://raw.githubusercontent.com/tinkerpop/gremlin/master/data/graph-example-1.xml
      > curl --upload-file graph-example-1.xml -H "content-type:application/graphml+xml" -X POST http://localhost:9999/bigdata/sparql?blueprints
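
      To verify the load, the SPARQL endpoint can be queried directly. This is a sketch using the standard SPARQL protocol over HTTP; it simply counts the statements in the store:

      > curl -H "accept:application/sparql-results+json" --data-urlencode 'query=SELECT (COUNT(*) AS ?n) WHERE { ?s ?p ?o }' http://localhost:9999/bigdata/sparql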
      

      3. Run an ant task to download, unpack, and configure the Gremlin console to work with bigdata:

      From the bigdata directory:
      > ant gremlin
      

      4. Start Gremlin:

      From the bigdata directory:
      > ./ant-build/gremlin-groovy-2.5.0/bin/gremlin.sh
               \,,,/
               (o o)
      -----oOOo-(_)-oOOo-----
      gremlin>
      

      5. From Gremlin (or Blueprints code), you can connect to the bigdata server, or create a local instance (either in-memory or persistent):

      gremlin> import com.bigdata.blueprints.*
      gremlin> remoteGraph = BigdataGraphFactory.connect("http://localhost:9999/bigdata")
      gremlin> inMemGraph = BigdataGraphFactory.create()
      gremlin> persistentGraph = BigdataGraphFactory.create("/tmp/bigdata.jnl")
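
      Any of these graphs can also be driven with plain Blueprints 2.x calls rather than Gremlin traversals. A minimal sketch against the remote graph follows; the element ids and property values are hypothetical, and I'm assuming the implementation accepts caller-supplied ids:

      gremlin> graph = BigdataGraphFactory.connect("http://localhost:9999/bigdata")
      gremlin> // standard Blueprints calls: addVertex(id), addEdge(id, out, in, label), setProperty(key, value)
      gremlin> v1 = graph.addVertex('person:1')
      gremlin> v2 = graph.addVertex('person:2')
      gremlin> v1.setProperty('name', 'marko')
      gremlin> v2.setProperty('name', 'vadas')
      gremlin> e1 = graph.addEdge('knows:1', v1, v2, 'knows')
      gremlin> v1.getProperty('name')
      gremlin> graph.shutdown()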
      

      6. If you've created a persistent local graph instance, you can easily re-open it later:

      gremlin> persistentGraph = BigdataGraphFactory.open("/tmp/bigdata.jnl")
      
      mikepersonick added a comment -

      The basic implementation of this is done:

      - Abstract BigdataGraph plus supporting classes (BigdataVertex, BigdataElement, BigdataEdge)
      - BigdataGraphEmbedded, which wraps an embedded bigdata instance (transactional)
      - BigdataGraphClient, which wraps a client/server bigdata instance (non-transactional)
      - BigdataGraphQuery, a wrapper that converts a Blueprints GraphQuery object to SPARQL (see the sketch after this list)
      - Gremlin integration
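
      A minimal sketch of the GraphQuery path, from the Gremlin console. This assumes graph.query() routes through BigdataGraphQuery, so the lookup below would be translated into SPARQL and evaluated against the store; the property value is hypothetical:

      gremlin> graph = BigdataGraphFactory.connect("http://localhost:9999/bigdata")
      gremlin> // Blueprints GraphQuery: find vertices with name = "marko"
      gremlin> graph.query().has('name', 'marko').vertices()
      gremlin> graph.shutdown()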

      Future work (and tickets) might include:

      - GOM integration
      - Rexster integration
      - Higher-performance caching implementations


        People

        • Assignee:
          mikepersonick
        • Reporter:
          bryanthompson
        • Votes:
          0
        • Watchers:
          2
