Details

    • Type: Bug
    • Status: Done
    • Resolution: Done
    • Affects Version/s: TERMS_REFACTOR_BRANCH
    • Fix Version/s: None
    • Component/s: HTree

      Description

       This might be related to copyOnWrite, or it might be an OOM issue. To reproduce, run query0021 with native distinct enabled in AST2BOpContext and AST2BOpBase (one place in each file) and add the following query hint to query0021 to enable a hash join:

                              OPTIONAL {
                                      ?_var10 p2:votedBy ?_var3.
                                      ?_var10 rdfs:label ?_var2.
                                      hint:BGP hint:com.bigdata.rdf.sparql.ast.eval.hashJoin "true" .
                              }
      
      WARN : Haltable.java:418: com.bigdata.util.concurrent.Haltable@4ecd1d6e : isFirstCause=true : java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: Not our child : oldChildAddr=-122441687883571187
      java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: Not our child : oldChildAddr=-122441687883571187
      	at java.util.concurrent.FutureTask$Sync.innerGet(FutureTask.java:222)
      	at java.util.concurrent.FutureTask.get(FutureTask.java:83)
      	at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1197)
      	at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTaskWrapper.run(ChunkedRunningQuery.java:758)
      	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
      	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
      	at java.util.concurrent.FutureTask.run(FutureTask.java:138)
      	at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63)
      	at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkFutureTask.run(ChunkedRunningQuery.java:664)
      	at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
      	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
      	at java.lang.Thread.run(Thread.java:662)
      Caused by: java.lang.IllegalArgumentException: Not our child : oldChildAddr=-122441687883571187
      	at com.bigdata.htree.DirectoryPage.replaceChildRef(DirectoryPage.java:1778)
      	at com.bigdata.htree.AbstractPage.copyOnWrite(AbstractPage.java:712)
      	at com.bigdata.htree.AbstractPage.copyOnWrite(AbstractPage.java:702)
      	at com.bigdata.htree.AbstractPage.copyOnWrite(AbstractPage.java:583)
      	at com.bigdata.htree.HTree.insert(HTree.java:1325)
      	at com.bigdata.htree.HTree.insert(HTree.java:1246)
      	at com.bigdata.bop.join.HTreeHashJoinUtility.saveInJoinSet(HTreeHashJoinUtility.java:1215)
      	at com.bigdata.bop.join.HTreeHashJoinUtility.hashJoin(HTreeHashJoinUtility.java:1137)
      	at com.bigdata.bop.join.HTreeSolutionSetHashJoinOp$ChunkTask.doHashJoin(HTreeSolutionSetHashJoinOp.java:304)
      	at com.bigdata.bop.join.HTreeSolutionSetHashJoinOp$ChunkTask.call(HTreeSolutionSetHashJoinOp.java:256)
      	at com.bigdata.bop.join.HTreeSolutionSetHashJoinOp$ChunkTask.call(HTreeSolutionSetHashJoinOp.java:1)
      	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
      	at java.util.concurrent.FutureTask.run(FutureTask.java:138)
      	at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1196)
      	... 9 more
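
       For orientation, the sketch below uses invented SimplePage/SimpleDirectoryPage types (not the real com.bigdata.htree classes) to illustrate the copy-on-write step that raises this exception: before a clean page is mutated it is copied, and its parent directory page must locate the stale child's persistent address among its slots in order to swap in the mutable copy. If no slot matches, parent and child are inconsistent and the "Not our child" error is thrown.

            // Hedged sketch only: simplified, hypothetical types; the real classes are
            // com.bigdata.htree.DirectoryPage and AbstractPage.
            class SimplePage {
                long addr;                  // persistent address of the clean (immutable) page
                SimpleDirectoryPage parent; // runtime reference to the parent directory page
            }

            class SimpleDirectoryPage extends SimplePage {
                long[] childAddrs;          // persistent child addresses
                SimplePage[] childRefs;     // runtime child references

                // Swap the mutable copy of a child into this directory page. The parent must
                // find the old child's persistent address among its slots; if no slot matches,
                // the parent and child are out of sync, which is the symptom in this issue.
                void replaceChildRef(final long oldChildAddr, final SimplePage newChild) {
                    boolean found = false;
                    for (int i = 0; i < childAddrs.length; i++) {
                        if (childAddrs[i] == oldChildAddr) {
                            childAddrs[i] = 0L;      // cleared until the copy is re-persisted
                            childRefs[i] = newChild; // runtime reference now points at the copy
                            found = true;
                        }
                    }
                    if (!found)
                        throw new IllegalArgumentException(
                                "Not our child : oldChildAddr=" + oldChildAddr);
                }
            }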
      

        Activity

        bryanthompson added a comment -

        In fact, the following all need to be turned on to work this issue:

        AST2BOpContext:

            boolean nativeDistinct = true;
            boolean nativeHashJoins = true;
        

        AST2BOpBase:

            protected static boolean nativeDefaultGraph = true;
        
        bryanthompson added a comment -

        Unfortunately, I am not able to replicate this against r5558. Instead, I get:

             [java] resultCount=19982, elapsed=285879ms, source=queries/query0021.rq
        
        bryanthompson added a comment -

        Here is a new stack trace:

             [java] Caused by: java.lang.IllegalArgumentException: Not our child : oldChildAddr=-57985176642248691
             [java]     at com.bigdata.htree.DirectoryPage.replaceChildRef(DirectoryPage.java:1781)
             [java]     at com.bigdata.htree.AbstractPage.copyOnWrite(AbstractPage.java:712)
             [java]     at com.bigdata.htree.AbstractPage.copyOnWrite(AbstractPage.java:702)
             [java]     at com.bigdata.htree.AbstractPage.copyOnWrite(AbstractPage.java:583)
             [java]     at com.bigdata.htree.HTree.insert(HTree.java:1340)
             [java]     at com.bigdata.bop.join.HTreeHashJoinUtility.acceptSolutions(HTreeHashJoinUtility.java:852)
        

        Merge joins and native htree were enabled. I believe that the query was:

             [java] queryString
             [java] # A variant of query0021 with optimizer disabled, hash join in the complex
             [java] # optional, and running the [name] optional last.
             [java] #
             [java] # elapsed=284401ms
             [java]
             [java] PREFIX p1: <http://www.rdfabout.com/rdf/schema/usgovt/>
             [java] PREFIX p2: <http://www.rdfabout.com/rdf/schema/vote/>
             [java] PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>
             [java] PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
             [java] PREFIX hint: <http://www.bigdata.com/queryHints#>
             [java]
             [java] SELECT (SAMPLE(?_var9) AS ?_var1) ?_var2 ?_var3
             [java] WITH {
             [java]     SELECT DISTINCT ?_var3
             [java]     WHERE {
             [java]             ?_var3 rdf:type <http://www.rdfabout.com/rdf/schema/politico/Politician>.
             [java]             ?_var3 <http://www.rdfabout.com/rdf/schema/politico/hasRole> ?_var6.
             [java]             ?_var6 <http://www.rdfabout.com/rdf/schema/politico/party> "Democrat".
             [java]     }
             [java] } AS %_set1
             [java]             WHERE {
             [java]            hint:Query hint:com.bigdata.rdf.sparql.ast.QueryHints.optimizer "None" .
             [java]
             [java]                     INCLUDE %_set1 .
             [java]                     OPTIONAL {
             [java]                             ?_var10 p2:votedBy ?_var3.
             [java]                             ?_var10 rdfs:label ?_var2.
             [java]                hint:BGP hint:com.bigdata.rdf.sparql.ast.eval.hashJoin "true" .
             [java] #               hint:BGP hint:com.bigdata.bop.IPredicate.keyOrder "PCSO" .
             [java]                     }
             [java]                     OPTIONAL {
             [java]                             ?_var3 p1:name ?_var9
             [java]                     }.
             [java]             }
             [java]             GROUP BY ?_var2 ?_var3
        
        bryanthompson added a comment -

        I am able to replicate this error consistently against r5618 using govtrack/queries/query0021c.rq.

        I have updated both the NSS statistics view and the error reporting for the query engine, so I can now localize the operator in the query plan for which the error is reported. It is an HTreeHashIndexOp with bopId=36.

        com.bigdata.bop.join.HTreeHashIndexOp[36](HTreeSolutionSetHashJoinOp[35])[
        com.bigdata.bop.BOp.bopId=36,
        com.bigdata.bop.BOp.evaluationContext=CONTROLLER,
        com.bigdata.bop.PipelineOp.maxParallel=1,
        com.bigdata.bop.PipelineOp.lastPass=true,
        com.bigdata.bop.IPredicate.relationName=[kb.lex],
        com.bigdata.bop.join.HashJoinAnnotations.joinVars=[_var3],
        namedSetRef=NamedSolutionSetRef{queryId=95fc16c2-e34d-4cc7-b3d7-f0a10a0e1187,namedSet=--nsr-2,joinVars=[_var3]}]
        

        I will attach the full query plan view from the NSS from before the point where this exception occurred.

        The following is the full stack trace of the firstCause for this exception:

             [java] WARN : Haltable.java:418: com.bigdata.util.concurrent.Haltable@72a45c4e : isFirstCause=true : java.lang.Exception: task=ChunkTask{query=95fc16c2-e34d-4cc7-b3d7-f0a10a0e1187,bopId=36,partitionId=-1,sinkId=37,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: cause=java.lang.IllegalArgumentException: Not our child : oldChildAddr=-52355840316792819, state=HTreeHashJoinUtility{open=true,chunkSize=1000,optional=false,filter=false,joinVars=[_var3],namespace=kb.lex,size=1144924,ivCacheSize=0,blobCacheSize=0}
             [java] java.lang.Exception: task=ChunkTask{query=95fc16c2-e34d-4cc7-b3d7-f0a10a0e1187,bopId=36,partitionId=-1,sinkId=37,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: cause=java.lang.IllegalArgumentException: Not our child : oldChildAddr=-52355840316792819, state=HTreeHashJoinUtility{open=true,chunkSize=1000,optional=false,filter=false,joinVars=[_var3],namespace=kb.lex,size=1144924,ivCacheSize=0,blobCacheSize=0}
             [java]     at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1217)
             [java]     at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTaskWrapper.run(ChunkedRunningQuery.java:758)
             [java]     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
             [java]     at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
             [java]     at java.util.concurrent.FutureTask.run(FutureTask.java:138)
             [java]     at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63)
             [java]     at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkFutureTask.run(ChunkedRunningQuery.java:664)
             [java]     at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
             [java]     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
             [java]     at java.lang.Thread.run(Thread.java:619)
             [java] Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: cause=java.lang.IllegalArgumentException: Not our child : oldChildAddr=-52355840316792819, state=HTreeHashJoinUtility{open=true,chunkSize=1000,optional=false,filter=false,joinVars=[_var3],namespace=kb.lex,size=1144924,ivCacheSize=0,blobCacheSize=0}
             [java]     at java.util.concurrent.FutureTask$Sync.innerGet(FutureTask.java:222)
             [java]     at java.util.concurrent.FutureTask.get(FutureTask.java:83)
             [java]     at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1197)
             [java]     ... 9 more
             [java] Caused by: java.lang.RuntimeException: cause=java.lang.IllegalArgumentException: Not our child : oldChildAddr=-52355840316792819, state=HTreeHashJoinUtility{open=true,chunkSize=1000,optional=false,filter=false,joinVars=[_var3],namespace=kb.lex,size=1144924,ivCacheSize=0,blobCacheSize=0}
             [java]     at com.bigdata.bop.join.HTreeHashJoinUtility.launderThrowable(HTreeHashJoinUtility.java:2442)
             [java]     at com.bigdata.bop.join.HTreeHashJoinUtility.acceptSolutions(HTreeHashJoinUtility.java:962)
             [java]     at com.bigdata.bop.join.HTreeHashIndexOp$ControllerTask.call(HTreeHashIndexOp.java:301)
             [java]     at com.bigdata.bop.join.HTreeHashIndexOp$ControllerTask.call(HTreeHashIndexOp.java:223)
             [java]     at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
             [java]     at java.util.concurrent.FutureTask.run(FutureTask.java:138)
             [java]     at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1196)
             [java]     ... 9 more
             [java] Caused by: java.lang.IllegalArgumentException: Not our child : oldChildAddr=-52355840316792819
             [java]     at com.bigdata.htree.DirectoryPage.replaceChildRef(DirectoryPage.java:1781)
             [java]     at com.bigdata.htree.AbstractPage.copyOnWrite(AbstractPage.java:712)
             [java]     at com.bigdata.htree.AbstractPage.copyOnWrite(AbstractPage.java:702)
             [java]     at com.bigdata.htree.AbstractPage.copyOnWrite(AbstractPage.java:583)
             [java]     at com.bigdata.htree.HTree.insert(HTree.java:1340)
             [java]     at com.bigdata.bop.join.HTreeHashJoinUtility.acceptSolutions(HTreeHashJoinUtility.java:949)
             [java]     ... 14 more
        

        The error occurs while accepting solutions into the HTree, which is all that an HTreeHashIndexOp does: it builds a hash index.
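
        For readers unfamiliar with the operator, the sketch below shows the general pattern only (it is not the HTreeHashJoinUtility API): each incoming solution is hashed on the join variables (here _var3) and stored in an index keyed by that hash code, with a java.util.HashMap standing in for the native HTree. In the real operator the store step is an HTree.insert, which is where the "Not our child" exception surfaces.

            import java.util.*;

            // Hedged sketch, not the actual operator code: build a hash index over solutions
            // keyed on the hash of the join variables.
            public class HashIndexSketch {

                // A solution is just a binding of variable names to values in this sketch.
                static int hashOnJoinVars(final Map<String, Object> solution,
                                          final List<String> joinVars) {
                    int h = 1;
                    for (String v : joinVars)
                        h = 31 * h + (solution.get(v) == null ? 0 : solution.get(v).hashCode());
                    return h;
                }

                public static void main(final String[] args) {
                    final List<String> joinVars = Arrays.asList("_var3");
                    // index: hash of the join variables -> solutions sharing that hash
                    final Map<Integer, List<Map<String, Object>>> index =
                            new HashMap<Integer, List<Map<String, Object>>>();

                    final Map<String, Object> solution = new HashMap<String, Object>();
                    solution.put("_var3", "politician-1"); // hypothetical binding
                    solution.put("_var2", "label-1");      // hypothetical binding

                    final int key = hashOnJoinVars(solution, joinVars);
                    List<Map<String, Object>> bucket = index.get(key);
                    if (bucket == null) {
                        bucket = new ArrayList<Map<String, Object>>();
                        index.put(key, bucket);
                    }
                    bucket.add(solution); // in the real operator this is HTree.insert(...)

                    System.out.println(index);
                }
            }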

        martyncutcher added a comment -

        I see a different error running this query, but I believe it may derive from a similar inconsistency. In this case there is a mismatch between the depth of a BucketPage and the number of references to it from its parent DirectoryPage (see the sketch after the trace below).

        Caused by: java.lang.AssertionError
        at com.bigdata.htree.DirectoryPage._splitBucketPage(DirectoryPage.java:130)
        at com.bigdata.htree.BucketPage.split(BucketPage.java:1367)
        at com.bigdata.htree.HTree.insert(HTree.java:1366)
        at com.bigdata.bop.join.HTreeHashJoinUtility.acceptSolutions(HTreeHashJoinUtility.java:949)
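
        For reference, the invariant in question can be stated as follows (a sketch with invented names, assuming the usual extendible-hashing rule rather than quoting the actual assertion): a child page of depth l under a directory page of global depth d should be referenced from exactly 2^(d - l) of the parent's slots.

            // Hedged sketch of the invariant described above, using invented names rather than
            // the real DirectoryPage/BucketPage classes.
            final class HTreeInvariantSketch {

                static void checkChildDepthInvariant(final int parentGlobalDepth,
                                                     final int childDepth,
                                                     final Object[] parentChildRefs,
                                                     final Object child) {
                    int refs = 0;
                    for (Object ref : parentChildRefs) {
                        if (ref == child)
                            refs++;
                    }
                    final int expected = 1 << (parentGlobalDepth - childDepth);
                    if (refs != expected)
                        throw new AssertionError("childDepth=" + childDepth + ": expected "
                                + expected + " parent references but found " + refs);
                }
            }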

        martyncutcher added a comment -

        This appears to have been resolved by a fix to DirectoryPage._addLevelForOverflow, where a persistent reference was not updated alongside the runtime reference.
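
        To make the nature of the fix concrete, here is a hedged illustration (invented names, not the actual _addLevelForOverflow code) of the pattern involved: each child slot of a directory page carries both a runtime reference and a persistent address, and both must be updated when a child is relinked. Updating only the runtime reference leaves the persistent address stale, so a later copyOnWrite that searches the parent by address cannot find the child and reports "Not our child".

            // Hedged illustration only; not the actual DirectoryPage code.
            final class ChildSlots {

                final Object[] childRefs; // runtime references to child pages
                final long[] childAddrs;  // persistent addresses of the same children

                ChildSlots(final int fanOut) {
                    childRefs = new Object[fanOut];
                    childAddrs = new long[fanOut];
                }

                // Correct pattern: update both views of the child in the same step.
                void relinkChild(final int slot, final Object newChildRef, final long newChildAddr) {
                    childRefs[slot] = newChildRef;   // runtime reference (was updated before the fix)
                    childAddrs[slot] = newChildAddr; // persistent reference (the update the fix adds)
                }
            }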

        bryanthompson added a comment -

        I can confirm that the updated HTree fixes the error for this query.


          People

          • Assignee: martyncutcher
          • Reporter: bryanthompson
          • Votes: 0
          • Watchers: 3

            Dates

            • Created:
            • Updated:
            • Resolved: