Building

Started building at 2023/02/16 15:07:41
Using pegged server, 3620 build
Calculating base
Updating mirror
Basing run on 7.5.0-3620 fe0478147a
Updating tree for run 16.02.2023-15.07
query is at 8a1c256, changes since last good build: none
gometa is at e041ad6, changes since last good build: none
ns_server is at c24130d, changes since last good build: none
couchstore is at d75745b, changes since last good build: none
forestdb is at acba458, changes since last good build: none
kv_engine is at 45f6681, changes since last good build: none
Switching indexing to unstable
indexing is at 3f1e881, changes since last good build: 
 3f1e8815 MB-55503: Implement ResumeDownloadToken cleanup
 a6613fa7 MB-55093 - resume minor bugs fixes
Switching plasma to unstable
plasma is at ae6765a, changes since last good build: 
fatal: Invalid revision range 26f8a5976be1d93a4a2e24739603f0160c8540e9..HEAD

Switching nitro to unstable
nitro is at d5f5610, changes since last good build: none
Switching gometa to master
gometa is at e041ad6, changes since last good build: none
Switching testrunner to master
Submodule 'gauntlet' (https://github.com/pavithra-mahamani/gauntlet) registered for path 'gauntlet'
Submodule 'java_sdk_client' (https://github.com/couchbaselabs/java_sdk_client) registered for path 'java_sdk_client'
Submodule 'lib/capellaAPI' (https://github.com/couchbaselabs/CapellaRESTAPIs) registered for path 'lib/capellaAPI'
Submodule 'magma_loader/DocLoader' (https://github.com/couchbaselabs/DocLoader.git) registered for path 'magma_loader/DocLoader'
Cloning into '/opt/build/testrunner/gauntlet'...
Cloning into '/opt/build/testrunner/java_sdk_client'...
Cloning into '/opt/build/testrunner/lib/capellaAPI'...
Cloning into '/opt/build/testrunner/magma_loader/DocLoader'...
Submodule path 'gauntlet': checked out '4e2424851a59c6f4b4edfdb7e36fa6a0874d6300'
Submodule path 'java_sdk_client': checked out '5dd338995c16ac2f5b187729e549b28862060732'
Submodule path 'lib/capellaAPI': checked out 'a98c80da07400e94855620675c74008da872586a'
Submodule path 'magma_loader/DocLoader': checked out 'e0d979fdcb9b1918d8e1e3787e0f431c79923758'
testrunner is at 8c74100, changes since last good build: 
 8c74100ec changed index names according to new endpoints:FTS
 a73b2ef88 Added support for checking and installing ntp on debian 10 machines
Pulling in uncommitted change 186279 at refs/changes/79/186279/5
Total 21 (delta 18), reused 21 (delta 18)
[unstable 8642ceb0] MB-55266 Invoke cleanup on RestoreShard completion
 Author: Varun Velamuri 
 Date: Tue Feb 7 14:48:50 2023 +0530
 1 file changed, 58 insertions(+)
Pulling in uncommitted change 186602 at refs/changes/02/186602/3
Total 7 (delta 3), reused 4 (delta 3)
[unstable 2c67b0eb] MB-55340: Use latest golang branch for builds
 Author: Dhananjay Kshirsagar 
 Date: Mon Feb 13 00:21:39 2023 +0530
 1 file changed, 7 insertions(+)
Pulling in uncommitted change 186798 at refs/changes/98/186798/3
[unstable 29f25aa7] MB-55266 Do transfer cleanup at transfer token level
 Author: Varun Velamuri 
 Date: Wed Feb 15 21:11:07 2023 +0530
 4 files changed, 12 insertions(+), 48 deletions(-)
Pulling in uncommitted change 186800 at refs/changes/00/186800/2
[unstable 0f95e8e3] MB-55266 Do sync cleanup of transferred data when retrying transfer
 Author: Varun Velamuri 
 Date: Wed Feb 15 21:37:42 2023 +0530
 4 files changed, 40 insertions(+), 9 deletions(-)
Pulling in uncommitted change 186605 at refs/changes/05/186605/4
Total 5 (delta 2), reused 3 (delta 2)
[unstable b00e01b] MB-55523: Do not S3 cleanup for Pause-Resume
 Author: Saptarshi Sen 
 Date: Sat Feb 11 16:34:16 2023 -0800
 3 files changed, 272 insertions(+), 36 deletions(-)
Pulling in uncommitted change 186606 at refs/changes/06/186606/4
Total 8 (delta 3), reused 5 (delta 3)
[unstable e3228e9] MB-55523: Add shard copy context logging
 Author: Saptarshi Sen 
 Date: Sat Feb 11 17:42:36 2023 -0800
 1 file changed, 61 insertions(+), 71 deletions(-)
Pulling in uncommitted change 186607 at refs/changes/07/186607/4
Total 13 (delta 7), reused 10 (delta 7)
[unstable 14cacba] MB-55523: Add DoCleanupStaging
 Author: Saptarshi Sen 
 Date: Sun Feb 12 11:36:06 2023 -0800
 3 files changed, 122 insertions(+), 21 deletions(-)
Pulling in uncommitted change 186608 at refs/changes/08/186608/4
Total 17 (delta 13), reused 17 (delta 13)
[unstable 41a9764] MB-55523: Add completion status in DoCleanup
 Author: Saptarshi Sen 
 Date: Sat Feb 11 19:36:45 2023 -0800
 2 files changed, 23 insertions(+), 15 deletions(-)
Building community edition
Building cmakefiles and deps [CE]
Building main product [CE]
Build CE finished
BUILD_ENTERPRISE empty. Building enterprise edition
Building Enterprise Edition
Building cmakefiles and deps [EE]
Building main product [EE]
Build EE finished

Testing

Started testing at 2023/02/16 16:02:02
Testing mode: sanity,unit,functional,serverless,integration
Using storage type: plasma
Setting ulimit to 200000

Simple Test

Feb 16 16:07:08 rebalance_in_with_ops (rebalance.rebalancein.RebalanceInTests) ... ok
Feb 16 16:11:00 rebalance_in_with_ops (rebalance.rebalancein.RebalanceInTests) ... ok
Feb 16 16:11:49 do_warmup_100k (memcapable.WarmUpMemcachedTest) ... ok
Feb 16 16:13:18 test_view_ops (view.createdeleteview.CreateDeleteViewTests) ... ok
Feb 16 16:14:08 b" 'stop_on_failure': 'True'}"
Feb 16 16:14:08 b'./testrunner -i b/resources/dev-4-nodes-xdcr.ini -p makefile=True,stop_on_failure=True,log_level=CRITICAL -t rebalance.rebalancein.RebalanceInTests.rebalance_in_with_ops,nodes_in=3,replicas=1,items=50000,get-logs-cluster-run=True,doc_ops=create;update;delete'
Feb 16 16:14:08 b"{'nodes_in': '3', 'replicas': '1', 'items': '50000', 'get-logs-cluster-run': 'True', 'doc_ops': 'create;update;delete', 'ini': 'b/resources/dev-4-nodes-xdcr.ini', 'cluster_name': 'dev-4-nodes-xdcr', 'spec': 'simple', 'conf_file': 'conf/simple.conf', 'makefile': 'True', 'stop_on_failure': 'True', 'log_level': 'CRITICAL', 'num_nodes': 4, 'case_number': 1, 'total_testcases': 8, 'last_case_fail': 'False', 'teardown_run': 'False', 'logs_folder': '/opt/build/testrunner/logs/testrunner-23-Feb-16_16-02-26/test_1'}"
Feb 16 16:14:08 b'-->result: '
Feb 16 16:14:08 b'summary so far suite rebalance.rebalancein.RebalanceInTests , pass 1 , fail 0'
Feb 16 16:14:08 b'./testrunner -i b/resources/dev-4-nodes-xdcr.ini -p makefile=True,stop_on_failure=True,log_level=CRITICAL -t rebalance.rebalancein.RebalanceInTests.rebalance_in_with_ops,nodes_in=3,bucket_type=ephemeral,replicas=1,items=50000,get-logs-cluster-run=True,doc_ops=create;update;delete'
Feb 16 16:14:08 b"{'nodes_in': '3', 'bucket_type': 'ephemeral', 'replicas': '1', 'items': '50000', 'get-logs-cluster-run': 'True', 'doc_ops': 'create;update;delete', 'ini': 'b/resources/dev-4-nodes-xdcr.ini', 'cluster_name': 'dev-4-nodes-xdcr', 'spec': 'simple', 'conf_file': 'conf/simple.conf', 'makefile': 'True', 'stop_on_failure': 'True', 'log_level': 'CRITICAL', 'num_nodes': 4, 'case_number': 2, 'total_testcases': 8, 'last_case_fail': 'False', 'teardown_run': 'True', 'logs_folder': '/opt/build/testrunner/logs/testrunner-23-Feb-16_16-02-26/test_2'}"
Feb 16 16:14:08 b'summary so far suite rebalance.rebalancein.RebalanceInTests , pass 2 , fail 0'
Feb 16 16:14:08 b'./testrunner -i b/resources/dev-4-nodes-xdcr.ini -p makefile=True,stop_on_failure=True,log_level=CRITICAL -t memcapable.WarmUpMemcachedTest.do_warmup_100k,get-logs-cluster-run=True'
Feb 16 16:14:08 b"{'get-logs-cluster-run': 'True', 'ini': 'b/resources/dev-4-nodes-xdcr.ini', 'cluster_name': 'dev-4-nodes-xdcr', 'spec': 'simple', 'conf_file': 'conf/simple.conf', 'makefile': 'True', 'stop_on_failure': 'True', 'log_level': 'CRITICAL', 'num_nodes': 4, 'case_number': 3, 'total_testcases': 8, 'last_case_fail': 'False', 'teardown_run': 'True', 'logs_folder': '/opt/build/testrunner/logs/testrunner-23-Feb-16_16-02-26/test_3'}"
Feb 16 16:14:08 b'summary so far suite rebalance.rebalancein.RebalanceInTests , pass 2 , fail 0'
Feb 16 16:14:08 b'summary so far suite memcapable.WarmUpMemcachedTest , pass 1 , fail 0'
Feb 16 16:14:08 b'./testrunner -i b/resources/dev-4-nodes-xdcr.ini -p makefile=True,stop_on_failure=True,log_level=CRITICAL -t view.createdeleteview.CreateDeleteViewTests.test_view_ops,ddoc_ops=create,test_with_view=True,num_ddocs=1,num_views_per_ddoc=10,items=1000,skip_cleanup=False,get-logs-cluster-run=True'
Feb 16 16:14:08 b"{'ddoc_ops': 'create', 'test_with_view': 'True', 'num_ddocs': '1', 'num_views_per_ddoc': '10', 'items': '1000', 'skip_cleanup': 'False', 'get-logs-cluster-run': 'True', 'ini': 'b/resources/dev-4-nodes-xdcr.ini', 'cluster_name': 'dev-4-nodes-xdcr', 'spec': 'simple', 'conf_file': 'conf/simple.conf', 'makefile': 'True', 'stop_on_failure': 'True', 'log_level': 'CRITICAL', 'num_nodes': 4, 'case_number': 4, 'total_testcases': 8, 'last_case_fail': 'False', 'teardown_run': 'True', 'logs_folder': '/opt/build/testrunner/logs/testrunner-23-Feb-16_16-02-26/test_4'}"
Feb 16 16:14:08 b'summary so far suite rebalance.rebalancein.RebalanceInTests , pass 2 , fail 0'
Feb 16 16:14:08 b'summary so far suite memcapable.WarmUpMemcachedTest , pass 1 , fail 0'
Feb 16 16:14:08 b'summary so far suite view.createdeleteview.CreateDeleteViewTests , pass 1 , fail 0'
Feb 16 16:23:32 test_employee_dataset_startkey_endkey_queries_rebalance_in (view.viewquerytests.ViewQueryTests) ... ok
Feb 16 16:23:32 b'./testrunner -i b/resources/dev-4-nodes-xdcr.ini -p makefile=True,stop_on_failure=True,log_level=CRITICAL -t view.viewquerytests.ViewQueryTests.test_employee_dataset_startkey_endkey_queries_rebalance_in,num_nodes_to_add=1,skip_rebalance=true,docs-per-day=1,timeout=1200,get-logs-cluster-run=True'
Feb 16 16:24:17 test_simple_dataset_stale_queries_data_modification (view.viewquerytests.ViewQueryTests) ... ok
Feb 16 16:28:03 load_with_ops (xdcr.uniXDCR.unidirectional) ... ok
Feb 16 16:31:59 load_with_failover (xdcr.uniXDCR.unidirectional) ... ok
Feb 16 16:34:41 suite_tearDown (xdcr.uniXDCR.unidirectional) ... ok
Feb 16 16:34:41 b"{'num_nodes_to_add': '1', 'skip_rebalance': 'true', 'docs-per-day': '1', 'timeout': '1200', 'get-logs-cluster-run': 'True', 'ini': 'b/resources/dev-4-nodes-xdcr.ini', 'cluster_name': 'dev-4-nodes-xdcr', 'spec': 'simple', 'conf_file': 'conf/simple.conf', 'makefile': 'True', 'stop_on_failure': 'True', 'log_level': 'CRITICAL', 'num_nodes': 4, 'case_number': 5, 'total_testcases': 8, 'last_case_fail': 'False', 'teardown_run': 'True', 'logs_folder': '/opt/build/testrunner/logs/testrunner-23-Feb-16_16-02-26/test_5'}"
Feb 16 16:34:41 b'summary so far suite rebalance.rebalancein.RebalanceInTests , pass 2 , fail 0'
Feb 16 16:34:41 b'summary so far suite memcapable.WarmUpMemcachedTest , pass 1 , fail 0'
Feb 16 16:34:41 b'summary so far suite view.createdeleteview.CreateDeleteViewTests , pass 1 , fail 0'
Feb 16 16:34:41 b'summary so far suite view.viewquerytests.ViewQueryTests , pass 1 , fail 0'
Feb 16 16:34:41 b'./testrunner -i b/resources/dev-4-nodes-xdcr.ini -p makefile=True,stop_on_failure=True,log_level=CRITICAL -t view.viewquerytests.ViewQueryTests.test_simple_dataset_stale_queries_data_modification,num-docs=1000,skip_rebalance=true,timeout=1200,get-logs-cluster-run=True'
Feb 16 16:34:41 b"{'num-docs': '1000', 'skip_rebalance': 'true', 'timeout': '1200', 'get-logs-cluster-run': 'True', 'ini': 'b/resources/dev-4-nodes-xdcr.ini', 'cluster_name': 'dev-4-nodes-xdcr', 'spec': 'simple', 'conf_file': 'conf/simple.conf', 'makefile': 'True', 'stop_on_failure': 'True', 'log_level': 'CRITICAL', 'num_nodes': 4, 'case_number': 6, 'total_testcases': 8, 'last_case_fail': 'False', 'teardown_run': 'True', 'logs_folder': '/opt/build/testrunner/logs/testrunner-23-Feb-16_16-02-26/test_6'}"
Feb 16 16:34:41 b'summary so far suite rebalance.rebalancein.RebalanceInTests , pass 2 , fail 0'
Feb 16 16:34:41 b'summary so far suite memcapable.WarmUpMemcachedTest , pass 1 , fail 0'
Feb 16 16:34:41 b'summary so far suite view.createdeleteview.CreateDeleteViewTests , pass 1 , fail 0'
Feb 16 16:34:41 b'summary so far suite view.viewquerytests.ViewQueryTests , pass 2 , fail 0'
Feb 16 16:34:41 b'./testrunner -i b/resources/dev-4-nodes-xdcr.ini -p makefile=True,stop_on_failure=True,log_level=CRITICAL -t xdcr.uniXDCR.unidirectional.load_with_ops,replicas=1,items=10000,value_size=128,ctopology=chain,rdirection=unidirection,doc-ops=update-delete,get-logs-cluster-run=True'
Feb 16 16:34:41 b"{'replicas': '1', 'items': '10000', 'value_size': '128', 'ctopology': 'chain', 'rdirection': 'unidirection', 'doc-ops': 'update-delete', 'get-logs-cluster-run': 'True', 'ini': 'b/resources/dev-4-nodes-xdcr.ini', 'cluster_name': 'dev-4-nodes-xdcr', 'spec': 'simple', 'conf_file': 'conf/simple.conf', 'makefile': 'True', 'stop_on_failure': 'True', 'log_level': 'CRITICAL', 'num_nodes': 4, 'case_number': 7, 'total_testcases': 8, 'last_case_fail': 'False', 'teardown_run': 'True', 'logs_folder': '/opt/build/testrunner/logs/testrunner-23-Feb-16_16-02-26/test_7'}"
Feb 16 16:34:41 b'summary so far suite rebalance.rebalancein.RebalanceInTests , pass 2 , fail 0'
Feb 16 16:34:41 b'summary so far suite memcapable.WarmUpMemcachedTest , pass 1 , fail 0'
Feb 16 16:34:41 b'summary so far suite view.createdeleteview.CreateDeleteViewTests , pass 1 , fail 0'
Feb 16 16:34:41 b'summary so far suite view.viewquerytests.ViewQueryTests , pass 2 , fail 0'
Feb 16 16:34:41 b'summary so far suite xdcr.uniXDCR.unidirectional , pass 1 , fail 0'
Feb 16 16:34:41 b'./testrunner -i b/resources/dev-4-nodes-xdcr.ini -p makefile=True,stop_on_failure=True,log_level=CRITICAL -t xdcr.uniXDCR.unidirectional.load_with_failover,replicas=1,items=10000,ctopology=chain,rdirection=unidirection,doc-ops=update-delete,failover=source,get-logs-cluster-run=True'
Feb 16 16:34:41 b"{'replicas': '1', 'items': '10000', 'ctopology': 'chain', 'rdirection': 'unidirection', 'doc-ops': 'update-delete', 'failover': 'source', 'get-logs-cluster-run': 'True', 'ini': 'b/resources/dev-4-nodes-xdcr.ini', 'cluster_name': 'dev-4-nodes-xdcr', 'spec': 'simple', 'conf_file': 'conf/simple.conf', 'makefile': 'True', 'stop_on_failure': 'True', 'log_level': 'CRITICAL', 'num_nodes': 4, 'case_number': 8, 'total_testcases': 8, 'last_case_fail': 'False', 'teardown_run': 'True', 'logs_folder': '/opt/build/testrunner/logs/testrunner-23-Feb-16_16-02-26/test_8'}"
Feb 16 16:34:41 b'summary so far suite rebalance.rebalancein.RebalanceInTests , pass 2 , fail 0'
Feb 16 16:34:41 b'summary so far suite memcapable.WarmUpMemcachedTest , pass 1 , fail 0'
Feb 16 16:34:41 b'summary so far suite view.createdeleteview.CreateDeleteViewTests , pass 1 , fail 0'
Feb 16 16:34:41 b'summary so far suite view.viewquerytests.ViewQueryTests , pass 2 , fail 0'
Feb 16 16:34:41 b'summary so far suite xdcr.uniXDCR.unidirectional , pass 2 , fail 0'
Feb 16 16:34:41 b'Run after suite setup for xdcr.uniXDCR.unidirectional.load_with_failover'
Feb 16 16:34:42 b"('rebalance.rebalancein.RebalanceInTests.rebalance_in_with_ops', ' pass')"
Feb 16 16:34:42 b"('rebalance.rebalancein.RebalanceInTests.rebalance_in_with_ops', ' pass')"
Feb 16 16:34:42 b"('memcapable.WarmUpMemcachedTest.do_warmup_100k', ' pass')"
Feb 16 16:34:42 b"('view.createdeleteview.CreateDeleteViewTests.test_view_ops', ' pass')"
Feb 16 16:34:42 b"('view.viewquerytests.ViewQueryTests.test_employee_dataset_startkey_endkey_queries_rebalance_in', ' pass')"
Feb 16 16:34:42 b"('view.viewquerytests.ViewQueryTests.test_simple_dataset_stale_queries_data_modification', ' pass')"
Feb 16 16:34:42 b"('xdcr.uniXDCR.unidirectional.load_with_ops', ' pass')"
Feb 16 16:34:42 b"('xdcr.uniXDCR.unidirectional.load_with_failover', ' pass')"

Unit tests

=== RUN   TestMerger
--- PASS: TestMerger (0.02s)
=== RUN   TestInsert
--- PASS: TestInsert (0.00s)
=== RUN   TestInsertPerf
16000 items took 16.407781ms -> 975147.0963684851 items/s conflicts 2
--- PASS: TestInsertPerf (0.02s)
=== RUN   TestGetPerf
16000 items took 2.273617ms -> 7.037245059304184e+06 items/s
--- PASS: TestGetPerf (0.01s)
=== RUN   TestGetRangeSplitItems
{
"node_count":             1000,
"soft_deletes":           0,
"read_conflicts":         0,
"insert_conflicts":       0,
"next_pointers_per_node": 1.3450,
"memory_used":            45520,
"node_allocs":            1000,
"node_frees":             0,
"level_node_distribution":{
"level0": 747,
"level1": 181,
"level2": 56,
"level3": 13,
"level4": 2,
"level5": 1,
"level6": 0,
"level7": 0,
"level8": 0,
"level9": 0,
"level10": 0,
"level11": 0,
"level12": 0,
"level13": 0,
"level14": 0,
"level15": 0,
"level16": 0,
"level17": 0,
"level18": 0,
"level19": 0,
"level20": 0,
"level21": 0,
"level22": 0,
"level23": 0,
"level24": 0,
"level25": 0,
"level26": 0,
"level27": 0,
"level28": 0,
"level29": 0,
"level30": 0,
"level31": 0,
"level32": 0
}
}
Split range keys [105 161 346 379 434 523 713]
No of items in each range [105 56 185 33 55 89 190 287]
--- PASS: TestGetRangeSplitItems (0.00s)
=== RUN   TestBuilder
{
"node_count":             50000,
"soft_deletes":           0,
"read_conflicts":         0,
"insert_conflicts":       0,
"next_pointers_per_node": 1.3367,
"memory_used":            2269392,
"node_allocs":            50000,
"node_frees":             0,
"level_node_distribution":{
"level0": 37380,
"level1": 9466,
"level2": 2370,
"level3": 578,
"level4": 152,
"level5": 40,
"level6": 10,
"level7": 3,
"level8": 1,
"level9": 0,
"level10": 0,
"level11": 0,
"level12": 0,
"level13": 0,
"level14": 0,
"level15": 0,
"level16": 0,
"level17": 0,
"level18": 0,
"level19": 0,
"level20": 0,
"level21": 0,
"level22": 0,
"level23": 0,
"level24": 0,
"level25": 0,
"level26": 0,
"level27": 0,
"level28": 0,
"level29": 0,
"level30": 0,
"level31": 0,
"level32": 0
}
}
Took 10.103696ms to build 50000 items, 4.948684e+06 items/sec
Took 1.017644ms to iterate 50000 items
--- PASS: TestBuilder (0.01s)
=== RUN   TestNodeDCAS
--- PASS: TestNodeDCAS (0.00s)
PASS
ok  	github.com/couchbase/nitro/skiplist	0.072s
=== RUN   TestZstdSimple
--- PASS: TestZstdSimple (0.00s)
=== RUN   TestZstdCompressBound
--- PASS: TestZstdCompressBound (3.02s)
=== RUN   TestZstdErrors
--- PASS: TestZstdErrors (0.00s)
=== RUN   TestZstdCompressLevels
--- PASS: TestZstdCompressLevels (0.76s)
=== RUN   TestZstdEmptySrc
--- PASS: TestZstdEmptySrc (0.00s)
=== RUN   TestZstdLargeSrc
--- PASS: TestZstdLargeSrc (0.00s)
PASS
ok  	github.com/couchbase/plasma/zstd	3.796s
go: downloading github.com/couchbase/tools-common v0.0.0-20221108111232-74639726fb4d
go: downloading github.com/aws/aws-sdk-go v1.44.105
=== RUN   TestAutoTunerWriteUsageStats
--- PASS: TestAutoTunerWriteUsageStats (12.01s)
=== RUN   TestAutoTunerReadUsageStats
--- PASS: TestAutoTunerReadUsageStats (8.60s)
=== RUN   TestAutoTunerCleanerUsageStats
--- PASS: TestAutoTunerCleanerUsageStats (9.56s)
=== RUN   TestAutoTunerDiskStats
--- PASS: TestAutoTunerDiskStats (2.50s)
=== RUN   TestAutoTunerTargetFragRatio
--- PASS: TestAutoTunerTargetFragRatio (0.00s)
=== RUN   TestAutoTunerExcessUsedSpace
--- PASS: TestAutoTunerExcessUsedSpace (0.00s)
=== RUN   TestAutoTunerUsedSpaceRatio
--- PASS: TestAutoTunerUsedSpaceRatio (0.00s)
=== RUN   TestAutoTunerAdjustFragRatio
--- PASS: TestAutoTunerAdjustFragRatio (0.00s)
=== RUN   TestAutoTuneFlushBufferAdjustMemQuotaSingleShard
--- PASS: TestAutoTuneFlushBufferAdjustMemQuotaSingleShard (17.49s)
=== RUN   TestAutoTuneFlushBufferAdjustMemQuotaManyShards
--- PASS: TestAutoTuneFlushBufferAdjustMemQuotaManyShards (15.72s)
=== RUN   TestAutoTuneFlushBufferRebalanceIdleShards
--- PASS: TestAutoTuneFlushBufferRebalanceIdleShards (13.71s)
=== RUN   TestAutoTuneFlushBufferGetUsedMemory
--- PASS: TestAutoTuneFlushBufferGetUsedMemory (25.96s)
=== RUN   TestBloom
--- PASS: TestBloom (4.98s)
=== RUN   TestBloomDisableEnable
--- PASS: TestBloomDisableEnable (3.85s)
=== RUN   TestBloomDisable
--- PASS: TestBloomDisable (0.04s)
=== RUN   TestBloomFreeDuringLookup
--- PASS: TestBloomFreeDuringLookup (0.03s)
=== RUN   TestBloomRecoveryFreeDuringLookup
--- PASS: TestBloomRecoveryFreeDuringLookup (0.12s)
=== RUN   TestBloomRecoverySwapInLookup
--- PASS: TestBloomRecoverySwapInLookup (0.07s)
=== RUN   TestBloomRecoverySwapOutLookup
--- PASS: TestBloomRecoverySwapOutLookup (0.11s)
=== RUN   TestBloomRecoveryInserts
--- PASS: TestBloomRecoveryInserts (0.07s)
=== RUN   TestBloomRecovery
--- PASS: TestBloomRecovery (0.14s)
=== RUN   TestBloomStats
--- PASS: TestBloomStats (3.80s)
=== RUN   TestBloomStatsRecovery
--- PASS: TestBloomStatsRecovery (1.37s)
=== RUN   TestBloomFilterSimple
--- PASS: TestBloomFilterSimple (0.00s)
=== RUN   TestBloomFilterConcurrent
--- PASS: TestBloomFilterConcurrent (21.66s)
=== RUN   TestBitArrayConcurrent
--- PASS: TestBitArrayConcurrent (1.17s)
=== RUN   TestBloomCapacity
--- PASS: TestBloomCapacity (0.00s)
=== RUN   TestBloomNumHashFuncs
--- PASS: TestBloomNumHashFuncs (0.00s)
=== RUN   TestBloomTestAndAdd
--- PASS: TestBloomTestAndAdd (0.21s)
=== RUN   TestBloomReset
--- PASS: TestBloomReset (0.00s)
=== RUN   TestLFSCopier
--- PASS: TestLFSCopier (0.00s)
=== RUN   TestLFSCopierNumBytes
--- PASS: TestLFSCopierNumBytes (0.01s)
=== RUN   TestCopierRestoreFile
--- PASS: TestCopierRestoreFile (0.02s)
=== RUN   TestCopierUploadDownloadBytes
--- PASS: TestCopierUploadDownloadBytes (0.00s)
=== RUN   TestS3CopierKeyEncoding
--- PASS: TestS3CopierKeyEncoding (0.03s)
=== RUN   TestSBCopyConcurrent
--- PASS: TestSBCopyConcurrent (0.21s)
=== RUN   TestSBCopyCorrupt
--- PASS: TestSBCopyCorrupt (0.01s)
=== RUN   TestLSSCopyHeadTailSingleSegment
--- PASS: TestLSSCopyHeadTailSingleSegment (0.02s)
=== RUN   TestLSSCopyFullSegments
--- PASS: TestLSSCopyFullSegments (0.56s)
=== RUN   TestLSSCopyPartialSegments
--- PASS: TestLSSCopyPartialSegments (1.17s)
=== RUN   TestLSSCopyHolePunching
--- PASS: TestLSSCopyHolePunching (0.59s)
=== RUN   TestLSSCopyConcurrent
--- PASS: TestLSSCopyConcurrent (0.79s)
=== RUN   TestLSSCopyS3MultipartHeapUsage
--- PASS: TestLSSCopyS3MultipartHeapUsage (0.00s)
=== RUN   TestShardCopySimple
--- PASS: TestShardCopySimple (0.36s)
=== RUN   TestShardCopyWithTimeout
--- PASS: TestShardCopyWithTimeout (0.00s)
=== RUN   TestShardCopyNoRP
--- PASS: TestShardCopyNoRP (0.16s)
=== RUN   TestShardCopyMetadataCorrupted
--- PASS: TestShardCopyMetadataCorrupted (0.04s)
=== RUN   TestShardCopyMetadataCorruptedII
--- PASS: TestShardCopyMetadataCorruptedII (0.07s)
=== RUN   TestShardCopyLSSMetadataCorrupted
--- PASS: TestShardCopyLSSMetadataCorrupted (0.07s)
=== RUN   TestShardCopyBeforeRecovery
--- PASS: TestShardCopyBeforeRecovery (0.00s)
=== RUN   TestShardCopySkipLog
--- PASS: TestShardCopySkipLog (1.06s)
=== RUN   TestShardCopyAddInstance
--- PASS: TestShardCopyAddInstance (2.42s)
=== RUN   TestShardCopyDestroyInstance
--- PASS: TestShardCopyDestroyInstance (0.34s)
=== RUN   TestShardCopyRestoreShard
--- PASS: TestShardCopyRestoreShard (18.29s)
=== RUN   TestShardCopyRestoreManyShards
--- PASS: TestShardCopyRestoreManyShards (9.77s)
=== RUN   TestShardCopyRestoreConcurrentLogCleaning
--- PASS: TestShardCopyRestoreConcurrentLogCleaning (21.92s)
=== RUN   TestShardCopyRestorePartialRollback
--- PASS: TestShardCopyRestorePartialRollback (19.58s)
=== RUN   TestInvalidMVCCRollback
--- PASS: TestInvalidMVCCRollback (0.31s)
=== RUN   TestShardCopyRestoreConcurrentPurges
--- PASS: TestShardCopyRestoreConcurrentPurges (20.05s)
=== RUN   TestShardCopyRestoreRepairHdr
--- PASS: TestShardCopyRestoreRepairHdr (91.75s)
=== RUN   TestShardCopyDuplicateIndex
--- PASS: TestShardCopyDuplicateIndex (0.17s)
=== RUN   TestTenantCopy
--- PASS: TestTenantCopy (5.18s)
=== RUN   TestLockShardAddInstance
--- PASS: TestLockShardAddInstance (0.20s)
=== RUN   TestLockShardAddInstanceMapping
--- PASS: TestLockShardAddInstanceMapping (0.27s)
=== RUN   TestLockShardCloseInstance
--- PASS: TestLockShardCloseInstance (0.37s)
=== RUN   TestLockShardEmptyShard
--- PASS: TestLockShardEmptyShard (0.17s)
=== RUN   TestLockShardUseShardId
--- PASS: TestLockShardUseShardId (0.25s)
=== RUN   TestLockShardUseShardIdII
--- PASS: TestLockShardUseShardIdII (0.05s)
=== RUN   TestDestroyShardID
--- PASS: TestDestroyShardID (0.73s)
=== RUN   TestDestroyShardIDConcurrent
--- PASS: TestDestroyShardIDConcurrent (0.18s)
=== RUN   TestDestroyShardIDNumTenants
--- PASS: TestDestroyShardIDNumTenants (0.56s)
=== RUN   TestDestroyShardIDTenantAddRemove
--- PASS: TestDestroyShardIDTenantAddRemove (0.22s)
=== RUN   TestDestroyShardIDOnRestoreFailure
--- PASS: TestDestroyShardIDOnRestoreFailure (0.73s)
=== RUN   TestDestroyShardIDOnRestoreFailure2
--- PASS: TestDestroyShardIDOnRestoreFailure2 (0.66s)
=== RUN   TestTransferShardAPI
--- PASS: TestTransferShardAPI (2.93s)
=== RUN   TestTransferShardAPICreateIndexes
--- PASS: TestTransferShardAPICreateIndexes (19.50s)
=== RUN   TestTransferShardAPIWithDropIndexes
--- PASS: TestTransferShardAPIWithDropIndexes (6.03s)
=== RUN   TestTransferShardAPIWithCancel
--- PASS: TestTransferShardAPIWithCancel (6.76s)
=== RUN   TestTransferShardAPIWithCleanup
--- PASS: TestTransferShardAPIWithCleanup (87.73s)
=== RUN   TestRestoreShardAPI
--- PASS: TestRestoreShardAPI (1.00s)
=== RUN   TestRestoreShardNumShards
--- PASS: TestRestoreShardNumShards (0.23s)
=== RUN   TestRestoreShardInvalidLocation
--- PASS: TestRestoreShardInvalidLocation (0.11s)
=== RUN   TestRestoreShardReplicaRepair
--- PASS: TestRestoreShardReplicaRepair (20.11s)
=== RUN   TestRestoreShardDone
--- PASS: TestRestoreShardDone (0.66s)
=== RUN   TestRestoreShardDoneConcurrent
--- PASS: TestRestoreShardDoneConcurrent (1.58s)
=== RUN   TestRestoreShardDoneDroppedInstances
--- PASS: TestRestoreShardDoneDroppedInstances (1.25s)
=== RUN   TestShardDoCleanupAPI
--- PASS: TestShardDoCleanupAPI (1.25s)
=== RUN   TestShardDoCleanupAPI2
--- PASS: TestShardDoCleanupAPI2 (25.29s)
=== RUN   TestRestoreShardPauseResume
github.com/couchbase/plasma.dumpStack()
	/opt/build/goproj/src/github.com/couchbase/plasma/testing.go:352 +0x45
github.com/couchbase/plasma.runTest.func4({0xbd92af, 0xe})
	/opt/build/goproj/src/github.com/couchbase/plasma/testing.go:431 +0xbd
github.com/couchbase/plasma.runTest(0xc00413c000, {0xbe5ccb, 0x1b}, 0xc381a8, {0xbd51e9, 0x7}, 0x0, 0x0)
	/opt/build/goproj/src/github.com/couchbase/plasma/testing.go:443 +0x423
github.com/couchbase/plasma.TestRestoreShardPauseResume(0x1309a10?)
	/opt/build/goproj/src/github.com/couchbase/plasma/copier_test.go:7600 +0x3f
github.com/couchbase/plasma.(*smrManager).run(0xc000353b30)
	/opt/build/goproj/src/github.com/couchbase/plasma/smr.go:526 +0x96
created by github.com/couchbase/plasma.NewSmrManager
	/opt/build/goproj/src/github.com/couchbase/plasma/smr.go:435 +0xaf
github.com/couchbase/plasma.runCleanerAutoTuner()
	/opt/build/goproj/src/github.com/couchbase/plasma/auto_tuner.go:653 +0x194
created by github.com/couchbase/plasma.init.2
	/opt/build/goproj/src/github.com/couchbase/plasma/shard.go:239 +0x1fa
github.com/couchbase/plasma.singletonWorker()
	/opt/build/goproj/src/github.com/couchbase/plasma/shard.go:5021 +0xb0
created by github.com/couchbase/plasma.init.2
	/opt/build/goproj/src/github.com/couchbase/plasma/shard.go:240 +0x206
github.com/couchbase/plasma.systemResourceTracker()
	/opt/build/goproj/src/github.com/couchbase/plasma/mem.go:542 +0xa5
created by github.com/couchbase/plasma.init.2
	/opt/build/goproj/src/github.com/couchbase/plasma/shard.go:241 +0x212
github.com/couchbase/plasma.AggregateAndLogStats()
	/opt/build/goproj/src/github.com/couchbase/plasma/shard.go:4982 +0x147
created by github.com/couchbase/plasma.init.2
	/opt/build/goproj/src/github.com/couchbase/plasma/shard.go:242 +0x21e
github.com/couchbase/plasma.(*CleanerAutoTuner).refreshCleanerBandwidth(0xc000010500)
	/opt/build/goproj/src/github.com/couchbase/plasma/auto_tuner.go:664 +0x86
created by github.com/couchbase/plasma.runCleanerAutoTuner
	/opt/build/goproj/src/github.com/couchbase/plasma/auto_tuner.go:650 +0x136
github.com/couchbase/plasma.(*TenantMgr).Run(0xc00017b400)
	/opt/build/goproj/src/github.com/couchbase/plasma/tenant.go:425 +0x9d
created by github.com/couchbase/plasma.init.3
	/opt/build/goproj/src/github.com/couchbase/plasma/tenant.go:57 +0x5d
github.com/couchbase/plasma.(*Plasma).monitor(0xc0002ea400)
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:554 +0x3a5
created by github.com/couchbase/plasma.(*Plasma).start
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:517 +0x25d
github.com/couchbase/plasma.(*Plasma).mvccPurgerDaemon(0xc00018e900)
	/opt/build/goproj/src/github.com/couchbase/plasma/mvcc_purger.go:132 +0x2c7
created by github.com/couchbase/plasma.(*Plasma).start
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:508 +0x1c5
github.com/couchbase/plasma.(*Plasma).monitor(0xc00018e900)
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:554 +0x3a5
created by github.com/couchbase/plasma.(*Plasma).start
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:517 +0x25d
github.com/couchbase/plasma.(*Plasma).mvccPurgerDaemon(0xc00018e000)
	/opt/build/goproj/src/github.com/couchbase/plasma/mvcc_purger.go:132 +0x2c7
created by github.com/couchbase/plasma.(*Plasma).start
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:508 +0x1c5
github.com/couchbase/plasma.(*Plasma).mvccPurgerDaemon(0xc0002ead00)
	/opt/build/goproj/src/github.com/couchbase/plasma/mvcc_purger.go:132 +0x2c7
created by github.com/couchbase/plasma.(*Plasma).start
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:508 +0x1c5
github.com/couchbase/plasma.(*Plasma).monitor(0xc0002eb600)
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:554 +0x3a5
created by github.com/couchbase/plasma.(*Plasma).start
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:517 +0x25d
github.com/couchbase/plasma.(*Plasma).monitor(0xc00018e000)
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:554 +0x3a5
created by github.com/couchbase/plasma.(*Plasma).start
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:517 +0x25d
github.com/couchbase/plasma.(*Plasma).mvccPurgerDaemon(0xc0002eb600)
	/opt/build/goproj/src/github.com/couchbase/plasma/mvcc_purger.go:132 +0x2c7
created by github.com/couchbase/plasma.(*Plasma).start
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:508 +0x1c5
github.com/couchbase/plasma.(*Plasma).monitor(0xc0002ead00)
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:554 +0x3a5
created by github.com/couchbase/plasma.(*Plasma).start
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:517 +0x25d
github.com/couchbase/plasma.(*Plasma).mvccPurgerDaemon(0xc0002ea400)
	/opt/build/goproj/src/github.com/couchbase/plasma/mvcc_purger.go:132 +0x2c7
created by github.com/couchbase/plasma.(*Plasma).start
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:508 +0x1c5
github.com/couchbase/plasma.dumpStack()
	/opt/build/goproj/src/github.com/couchbase/plasma/testing.go:352 +0x45
github.com/couchbase/plasma.runTest.func4({0xbd86bb, 0xd})
	/opt/build/goproj/src/github.com/couchbase/plasma/testing.go:431 +0xbd
github.com/couchbase/plasma.runTest(0xc00413c000, {0xbe5ccb, 0x1b}, 0xc381a8, {0xbd51e9, 0x7}, 0x0, 0x0)
	/opt/build/goproj/src/github.com/couchbase/plasma/testing.go:446 +0x46c
github.com/couchbase/plasma.TestRestoreShardPauseResume(0x1309a10?)
	/opt/build/goproj/src/github.com/couchbase/plasma/copier_test.go:7600 +0x3f
github.com/couchbase/plasma.(*smrManager).run(0xc000353b30)
	/opt/build/goproj/src/github.com/couchbase/plasma/smr.go:526 +0x96
created by github.com/couchbase/plasma.NewSmrManager
	/opt/build/goproj/src/github.com/couchbase/plasma/smr.go:435 +0xaf
github.com/couchbase/plasma.runCleanerAutoTuner()
	/opt/build/goproj/src/github.com/couchbase/plasma/auto_tuner.go:653 +0x194
created by github.com/couchbase/plasma.init.2
	/opt/build/goproj/src/github.com/couchbase/plasma/shard.go:239 +0x1fa
github.com/couchbase/plasma.singletonWorker()
	/opt/build/goproj/src/github.com/couchbase/plasma/shard.go:5021 +0xb0
created by github.com/couchbase/plasma.init.2
	/opt/build/goproj/src/github.com/couchbase/plasma/shard.go:240 +0x206
github.com/couchbase/plasma.systemResourceTracker()
	/opt/build/goproj/src/github.com/couchbase/plasma/mem.go:542 +0xa5
created by github.com/couchbase/plasma.init.2
	/opt/build/goproj/src/github.com/couchbase/plasma/shard.go:241 +0x212
github.com/couchbase/plasma.AggregateAndLogStats()
	/opt/build/goproj/src/github.com/couchbase/plasma/shard.go:4982 +0x147
created by github.com/couchbase/plasma.init.2
	/opt/build/goproj/src/github.com/couchbase/plasma/shard.go:242 +0x21e
github.com/couchbase/plasma.(*CleanerAutoTuner).refreshCleanerBandwidth(0xc000010500)
	/opt/build/goproj/src/github.com/couchbase/plasma/auto_tuner.go:664 +0x86
created by github.com/couchbase/plasma.runCleanerAutoTuner
	/opt/build/goproj/src/github.com/couchbase/plasma/auto_tuner.go:650 +0x136
github.com/couchbase/plasma.(*TenantMgr).Run(0xc00017b400)
	/opt/build/goproj/src/github.com/couchbase/plasma/tenant.go:425 +0x9d
created by github.com/couchbase/plasma.init.3
	/opt/build/goproj/src/github.com/couchbase/plasma/tenant.go:57 +0x5d
github.com/couchbase/plasma.(*Plasma).monitor(0xc0002ea400)
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:554 +0x3a5
created by github.com/couchbase/plasma.(*Plasma).start
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:517 +0x25d
github.com/couchbase/plasma.(*Plasma).mvccPurgerDaemon(0xc00018e900)
	/opt/build/goproj/src/github.com/couchbase/plasma/mvcc_purger.go:132 +0x2c7
created by github.com/couchbase/plasma.(*Plasma).start
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:508 +0x1c5
github.com/couchbase/plasma.(*Plasma).monitor(0xc00018e900)
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:554 +0x3a5
created by github.com/couchbase/plasma.(*Plasma).start
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:517 +0x25d
github.com/couchbase/plasma.(*Plasma).mvccPurgerDaemon(0xc00018e000)
	/opt/build/goproj/src/github.com/couchbase/plasma/mvcc_purger.go:132 +0x2c7
created by github.com/couchbase/plasma.(*Plasma).start
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:508 +0x1c5
github.com/couchbase/plasma.(*Plasma).mvccPurgerDaemon(0xc0002ead00)
	/opt/build/goproj/src/github.com/couchbase/plasma/mvcc_purger.go:132 +0x2c7
created by github.com/couchbase/plasma.(*Plasma).start
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:508 +0x1c5
github.com/couchbase/plasma.(*Plasma).monitor(0xc0002eb600)
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:554 +0x3a5
created by github.com/couchbase/plasma.(*Plasma).start
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:517 +0x25d
github.com/couchbase/plasma.(*Plasma).monitor(0xc00018e000)
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:554 +0x3a5
created by github.com/couchbase/plasma.(*Plasma).start
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:517 +0x25d
github.com/couchbase/plasma.(*Plasma).mvccPurgerDaemon(0xc0002eb600)
	/opt/build/goproj/src/github.com/couchbase/plasma/mvcc_purger.go:132 +0x2c7
created by github.com/couchbase/plasma.(*Plasma).start
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:508 +0x1c5
github.com/couchbase/plasma.(*Plasma).monitor(0xc0002ead00)
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:554 +0x3a5
created by github.com/couchbase/plasma.(*Plasma).start
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:517 +0x25d
github.com/couchbase/plasma.(*Plasma).mvccPurgerDaemon(0xc0002ea400)
	/opt/build/goproj/src/github.com/couchbase/plasma/mvcc_purger.go:132 +0x2c7
created by github.com/couchbase/plasma.(*Plasma).start
	/opt/build/goproj/src/github.com/couchbase/plasma/plasma.go:508 +0x1c5
--- FAIL: TestRestoreShardPauseResume (0.48s)
=== RUN   TestDiag
--- PASS: TestDiag (0.72s)
=== RUN   TestDumpLog
--- PASS: TestDumpLog (0.08s)
=== RUN   TestExtrasN1
=== RUN   TestExtrasN2
=== RUN   TestExtrasN3
=== RUN   TestGMRecovery
--- PASS: TestGMRecovery (13.38s)
=== RUN   TestIteratorSimple
--- PASS: TestIteratorSimple (4.89s)
=== RUN   TestIteratorSeek
--- PASS: TestIteratorSeek (6.07s)
=== RUN   TestPlasmaIteratorSeekFirst
--- PASS: TestPlasmaIteratorSeekFirst (0.55s)
=== RUN   TestPlasmaIteratorSwapin
--- PASS: TestPlasmaIteratorSwapin (5.26s)
=== RUN   TestIteratorSetEnd
--- PASS: TestIteratorSetEnd (0.78s)
=== RUN   TestIterHiItm
--- PASS: TestIterHiItm (2.76s)
=== RUN   TestIterDeleteSplitMerge
--- PASS: TestIterDeleteSplitMerge (0.04s)
=== RUN   TestKeySamplingSingle
--- PASS: TestKeySamplingSingle (0.14s)
=== RUN   TestKeySamplingAll
--- PASS: TestKeySamplingAll (0.17s)
=== RUN   TestKeySamplingEmpty
--- PASS: TestKeySamplingEmpty (0.05s)
=== RUN   TestKeySamplingExceed
--- PASS: TestKeySamplingExceed (0.15s)
=== RUN   TestLogOperation
--- PASS: TestLogOperation (59.37s)
=== RUN   TestLogLargeSize
--- PASS: TestLogLargeSize (0.16s)
=== RUN   TestLogTrim
--- PASS: TestLogTrim (60.54s)
=== RUN   TestLogSuperblockCorruption
--- PASS: TestLogSuperblockCorruption (59.41s)
=== RUN   TestLogTrimHolePunch
--- PASS: TestLogTrimHolePunch (49.57s)
=== RUN   TestLogMissingAndTruncatedSegments
--- PASS: TestLogMissingAndTruncatedSegments (0.07s)
=== RUN   TestLogReadBeyondMaxFileIndex
--- PASS: TestLogReadBeyondMaxFileIndex (2.57s)
=== RUN   TestLogReadEOFWithMMap
--- PASS: TestLogReadEOFWithMMap (0.00s)
=== RUN   TestShardLSSCleaning
--- PASS: TestShardLSSCleaning (0.28s)
=== RUN   TestShardLSSCleaningDeleteInstance
--- PASS: TestShardLSSCleaningDeleteInstance (0.24s)
=== RUN   TestShardLSSCleaningCorruptInstance
--- PASS: TestShardLSSCleaningCorruptInstance (0.23s)
=== RUN   TestPlasmaLSSCleaner
--- PASS: TestPlasmaLSSCleaner (218.58s)
=== RUN   TestPlasmaLSSCleanerMinSize
--- PASS: TestPlasmaLSSCleanerMinSize (12.20s)
=== RUN   TestLSSBasic
--- PASS: TestLSSBasic (0.08s)
=== RUN   TestLSSConcurrent
--- PASS: TestLSSConcurrent (0.86s)
=== RUN   TestLSSCleaner
--- PASS: TestLSSCleaner (12.39s)
=== RUN   TestLSSSuperBlock
--- PASS: TestLSSSuperBlock (1.05s)
=== RUN   TestLSSLargeSinglePayload
--- PASS: TestLSSLargeSinglePayload (0.83s)
=== RUN   TestLSSUnstableEnvironment
--- PASS: TestLSSUnstableEnvironment (10.24s)
=== RUN   TestLSSSmallFlushBuffer
--- PASS: TestLSSSmallFlushBuffer (0.01s)
=== RUN   TestLSSTrimFlushBufferGC
--- PASS: TestLSSTrimFlushBufferGC (1.54s)
=== RUN   TestLSSTrimFlushBufferNoIO
--- PASS: TestLSSTrimFlushBufferNoIO (30.01s)
=== RUN   TestLSSTrimFlushBufferWithIO
--- PASS: TestLSSTrimFlushBufferWithIO (33.15s)
=== RUN   TestLSSExtendFlushBufferWithIO
--- PASS: TestLSSExtendFlushBufferWithIO (30.02s)
=== RUN   TestLSSCtxTrimFlushBuffer
--- PASS: TestLSSCtxTrimFlushBuffer (5.46s)
=== RUN   TestLSSNegativeGetFlushBufferMemory
--- PASS: TestLSSNegativeGetFlushBufferMemory (0.01s)
=== RUN   TestLSSNegativeGetFlushBufferMemoryII
--- PASS: TestLSSNegativeGetFlushBufferMemoryII (0.01s)
=== RUN   TestMem
Plasma: Adaptive memory quota tuning (decrementing): RSS:925540352, freePercent:88.8287563961393, currentQuota=1099511627776, newQuota=1073741824, netGrowth=0, percent=99
Plasma: Adaptive memory quota tuning (incrementing): RSS:925114368, freePercent: 88.83076905150438, currentQuota=0, newQuota=10995116277
--- PASS: TestMem (15.02s)
=== RUN   TestCpu
--- PASS: TestCpu (14.68s)
=== RUN   TestTopTen20
--- PASS: TestTopTen20 (0.83s)
=== RUN   TestTopTen5
--- PASS: TestTopTen5 (0.20s)
=== RUN   TestMVCCSimple
--- PASS: TestMVCCSimple (0.30s)
=== RUN   TestMVCCLookup
--- PASS: TestMVCCLookup (0.17s)
=== RUN   TestMVCCIteratorRefresh
--- PASS: TestMVCCIteratorRefresh (5.34s)
=== RUN   TestMVCCIteratorRefreshEveryRow
--- PASS: TestMVCCIteratorRefreshEveryRow (1.35s)
=== RUN   TestMVCCGarbageCollection
--- PASS: TestMVCCGarbageCollection (0.12s)
=== RUN   TestMVCCRecoveryPoint
--- PASS: TestMVCCRecoveryPoint (2.80s)
=== RUN   TestMVCCRollbackMergeSibling
--- PASS: TestMVCCRollbackMergeSibling (0.07s)
=== RUN   TestMVCCRollbackCompact
--- PASS: TestMVCCRollbackCompact (0.07s)
=== RUN   TestMVCCRollbackSplit
--- PASS: TestMVCCRollbackSplit (0.07s)
=== RUN   TestMVCCRollbackItemsNotInSnapshot
--- PASS: TestMVCCRollbackItemsNotInSnapshot (0.20s)
=== RUN   TestMVCCRecoveryPointRollbackedSnapshot
--- PASS: TestMVCCRecoveryPointRollbackedSnapshot (1.37s)
=== RUN   TestMVCCRollbackBetweenRecoveryPoint
--- PASS: TestMVCCRollbackBetweenRecoveryPoint (1.35s)
=== RUN   TestMVCCRecoveryPointCrash
--- PASS: TestMVCCRecoveryPointCrash (0.11s)
=== RUN   TestMVCCIntervalGC
--- PASS: TestMVCCIntervalGC (0.27s)
=== RUN   TestMVCCItemsCount
--- PASS: TestMVCCItemsCount (0.48s)
=== RUN   TestLargeItems
--- PASS: TestLargeItems (108.65s)
=== RUN   TestTooLargeKey
--- PASS: TestTooLargeKey (3.57s)
=== RUN   TestMVCCItemUpdateSize
--- PASS: TestMVCCItemUpdateSize (0.33s)
=== RUN   TestEvictionStats
--- PASS: TestEvictionStats (0.67s)
=== RUN   TestReaderCacheStats
--- PASS: TestReaderCacheStats (1.20s)
=== RUN   TestInvalidSnapshot
--- PASS: TestInvalidSnapshot (1.41s)
=== RUN   TestEmptyKeyInsert
--- PASS: TestEmptyKeyInsert (0.04s)
=== RUN   TestMVCCRecoveryPointError
--- PASS: TestMVCCRecoveryPointError (0.04s)
=== RUN   TestMVCCReaderPurgeSequential
--- PASS: TestMVCCReaderPurgeSequential (0.31s)
=== RUN   TestMVCCReaderNoPurge
--- PASS: TestMVCCReaderNoPurge (0.31s)
=== RUN   TestMVCCReaderPurgeAfterUpdate
--- PASS: TestMVCCReaderPurgeAfterUpdate (0.31s)
=== RUN   TestMVCCReaderPurgeAfterRollback
--- PASS: TestMVCCReaderPurgeAfterRollback (0.33s)
=== RUN   TestMVCCReaderPurgeSimple
--- PASS: TestMVCCReaderPurgeSimple (0.07s)
=== RUN   TestMVCCReaderPurgeRandom
--- PASS: TestMVCCReaderPurgeRandom (0.31s)
=== RUN   TestMVCCReaderPurgePageFlag
--- PASS: TestMVCCReaderPurgePageFlag (0.15s)
=== RUN   TestMVCCPurgeRatioWithRollback
--- PASS: TestMVCCPurgeRatioWithRollback (16.34s)
=== RUN   TestComputeItemsCountMVCCWithRollbackI
--- PASS: TestComputeItemsCountMVCCWithRollbackI (0.13s)
=== RUN   TestComputeItemsCountMVCCWithRollbackII
--- PASS: TestComputeItemsCountMVCCWithRollbackII (0.05s)
=== RUN   TestComputeItemsCountMVCCWithRollbackIII
--- PASS: TestComputeItemsCountMVCCWithRollbackIII (0.09s)
=== RUN   TestComputeItemsCountMVCCWithRollbackIV
--- PASS: TestComputeItemsCountMVCCWithRollbackIV (0.09s)
=== RUN   TestMVCCPurgedRecordsWithCompactFullMarshalAndCascadedEmptyPagesMerge
--- PASS: TestMVCCPurgedRecordsWithCompactFullMarshalAndCascadedEmptyPagesMerge (2.70s)
=== RUN   TestMaxDeltaChainLenWithCascadedEmptyPagesMerge
--- PASS: TestMaxDeltaChainLenWithCascadedEmptyPagesMerge (2.54s)
=== RUN   TestAutoHoleCleaner
--- PASS: TestAutoHoleCleaner (46.01s)
=== RUN   TestAutoHoleCleaner5Indexes
--- PASS: TestAutoHoleCleaner5Indexes (284.11s)
=== RUN   TestIteratorReportedHoleRegionBoundary
--- PASS: TestIteratorReportedHoleRegionBoundary (0.16s)
=== RUN   TestFullRangeHoleScans
--- PASS: TestFullRangeHoleScans (0.53s)
=== RUN   TestOverlappingRangeHoleScans
--- PASS: TestOverlappingRangeHoleScans (0.52s)
=== RUN   TestMVCCIteratorSMRRefreshOnHoleScan
--- PASS: TestMVCCIteratorSMRRefreshOnHoleScan (12.62s)
=== RUN   TestAutoHoleCleanerWithRecovery
--- PASS: TestAutoHoleCleanerWithRecovery (2.48s)
=== RUN   TestPageMergeCorrectness2
--- PASS: TestPageMergeCorrectness2 (0.00s)
=== RUN   TestPageMergeCorrectness
--- PASS: TestPageMergeCorrectness (0.01s)
=== RUN   TestPageMarshalFull
--- PASS: TestPageMarshalFull (0.01s)
=== RUN   TestPageMergeMarshal
--- PASS: TestPageMergeMarshal (0.00s)
=== RUN   TestPageOperations
--- PASS: TestPageOperations (0.03s)
=== RUN   TestPageIterator
--- PASS: TestPageIterator (0.00s)
=== RUN   TestPageMarshal
--- PASS: TestPageMarshal (0.02s)
=== RUN   TestPageMergeCorrectness3
--- PASS: TestPageMergeCorrectness3 (0.00s)
=== RUN   TestPageHasDataRecords
--- PASS: TestPageHasDataRecords (0.00s)
=== RUN   TestPlasmaPageVisitor
--- PASS: TestPlasmaPageVisitor (4.65s)
=== RUN   TestPageRingVisitor
--- PASS: TestPageRingVisitor (4.43s)
=== RUN   TestPauseVisitorOnLowMemory
--- PASS: TestPauseVisitorOnLowMemory (1.17s)
=== RUN   TestCheckpointRecovery
--- PASS: TestCheckpointRecovery (13.34s)
=== RUN   TestPageCorruption
--- PASS: TestPageCorruption (1.31s)
=== RUN   TestCheckPointRecoveryFollowCleaning
--- PASS: TestCheckPointRecoveryFollowCleaning (0.12s)
=== RUN   TestFragmentationWithZeroItems
--- PASS: TestFragmentationWithZeroItems (1.14s)
=== RUN   TestEvictOnPersist
--- PASS: TestEvictOnPersist (0.21s)
=== RUN   TestPlasmaSimple
--- PASS: TestPlasmaSimple (13.78s)
=== RUN   TestPlasmaCompression
--- PASS: TestPlasmaCompression (0.04s)
=== RUN   TestPlasmaCompressionWrong
--- PASS: TestPlasmaCompressionWrong (0.04s)
=== RUN   TestPlasmaInMemCompression
--- PASS: TestPlasmaInMemCompression (0.03s)
=== RUN   TestPlasmaInMemCompressionZstd
--- PASS: TestPlasmaInMemCompressionZstd (0.04s)
=== RUN   TestPlasmaInMemCompressionWrong
--- PASS: TestPlasmaInMemCompressionWrong (0.04s)
=== RUN   TestSpoiledConfig
--- PASS: TestSpoiledConfig (0.06s)
=== RUN   TestPlasmaErrorFile
--- PASS: TestPlasmaErrorFile (0.03s)
=== RUN   TestPlasmaPersistor
--- PASS: TestPlasmaPersistor (9.98s)
=== RUN   TestPlasmaEvictionLSSDataSize
--- PASS: TestPlasmaEvictionLSSDataSize (0.04s)
=== RUN   TestPlasmaEviction
--- PASS: TestPlasmaEviction (30.77s)
=== RUN   TestConcurrDelOps
--- PASS: TestConcurrDelOps (68.07s)
=== RUN   TestPlasmaDataSize
--- PASS: TestPlasmaDataSize (0.04s)
=== RUN   TestLargeBasePage
--- PASS: TestLargeBasePage (41.72s)
=== RUN   TestLargeValue
--- PASS: TestLargeValue (97.59s)
=== RUN   TestPlasmaTooLargeKey
--- PASS: TestPlasmaTooLargeKey (3.53s)
=== RUN   TestEvictAfterMerge
--- PASS: TestEvictAfterMerge (0.16s)
=== RUN   TestEvictDirty
--- PASS: TestEvictDirty (0.22s)
=== RUN   TestEvictUnderQuota
--- PASS: TestEvictUnderQuota (60.17s)
=== RUN   TestEvictSetting
--- PASS: TestEvictSetting (1.28s)
=== RUN   TestBasePageAfterCompaction
--- PASS: TestBasePageAfterCompaction (0.17s)
=== RUN   TestSwapout
--- PASS: TestSwapout (0.03s)
=== RUN   TestSwapoutSplitBasePage
--- PASS: TestSwapoutSplitBasePage (0.04s)
=== RUN   TestCompactFullMarshal
--- PASS: TestCompactFullMarshal (0.08s)
=== RUN   TestPageStats
--- PASS: TestPageStats (3.03s)
=== RUN   TestPageStatsTinyIndex
--- PASS: TestPageStatsTinyIndex (0.14s)
=== RUN   TestPageStatsTinyIndexOnRecovery
--- PASS: TestPageStatsTinyIndexOnRecovery (0.09s)
=== RUN   TestPageStatsTinyIndexOnSplitAndMerge
--- PASS: TestPageStatsTinyIndexOnSplitAndMerge (0.04s)
=== RUN   TestPageCompress
--- PASS: TestPageCompress (0.07s)
=== RUN   TestPageCompressSwapin
--- PASS: TestPageCompressSwapin (0.05s)
=== RUN   TestPageCompressStats
--- PASS: TestPageCompressStats (1.14s)
=== RUN   TestPageDecompressStats
--- PASS: TestPageDecompressStats (0.05s)
=== RUN   TestSharedDedicatedDataSize
--- PASS: TestSharedDedicatedDataSize (5.10s)
=== RUN   TestLastRpSns
--- PASS: TestLastRpSns (0.05s)
=== RUN   TestPageCompressState
--- PASS: TestPageCompressState (0.06s)
=== RUN   TestPageCompressDuringBurst
--- PASS: TestPageCompressDuringBurst (0.06s)
=== RUN   TestPageDontDecompressDuringScan
--- PASS: TestPageDontDecompressDuringScan (0.16s)
=== RUN   TestPageDecompressAndCompressSwapin
--- PASS: TestPageDecompressAndCompressSwapin (2.07s)
=== RUN   TestPageCompressibleStat
--- PASS: TestPageCompressibleStat (0.57s)
=== RUN   TestPageCompressibleStatRecovery
--- PASS: TestPageCompressibleStatRecovery (0.17s)
=== RUN   TestPageCompressBeforeEvictPercent
--- PASS: TestPageCompressBeforeEvictPercent (1.21s)
=== RUN   TestPageCompressDecompressAfterDisable
--- PASS: TestPageCompressDecompressAfterDisable (1.19s)
=== RUN   TestPageKeepSwapinChainCompressed
--- PASS: TestPageKeepSwapinChainCompressed (0.09s)
=== RUN   TestEnforceSameCompressionAlgo
--- PASS: TestEnforceSameCompressionAlgo (0.22s)
=== RUN   TestPageCompressChangeAlgo
--- PASS: TestPageCompressChangeAlgo (0.74s)
=== RUN   TestWrittenDataSz
--- PASS: TestWrittenDataSz (3.88s)
=== RUN   TestWrittenDataSzAfterRecoveryCleaning
--- PASS: TestWrittenDataSzAfterRecoveryCleaning (4.50s)
=== RUN   TestWrittenHdrSz
--- PASS: TestWrittenHdrSz (3.81s)
=== RUN   TestPersistConfigUpgrade
--- PASS: TestPersistConfigUpgrade (0.01s)
=== RUN   TestLSSSegmentSize
--- PASS: TestLSSSegmentSize (0.22s)
=== RUN   TestPlasmaFlushBufferSzCfg
--- PASS: TestPlasmaFlushBufferSzCfg (0.17s)
=== RUN   TestCompactionCountwithCompactFullMarshal
--- PASS: TestCompactionCountwithCompactFullMarshal (0.08s)
=== RUN   TestCompactionCountwithCompactFullMarshalSMO
--- PASS: TestCompactionCountwithCompactFullMarshalSMO (0.05s)
=== RUN   TestPageHasDataRecordsOnCompactFullMarshal
--- PASS: TestPageHasDataRecordsOnCompactFullMarshal (0.08s)
=== RUN   TestPauseReaderOnLowMemory
--- PASS: TestPauseReaderOnLowMemory (1.05s)
=== RUN   TestRecoveryCleanerFragRatio
--- PASS: TestRecoveryCleanerFragRatio (219.93s)
=== RUN   TestRecoveryCleanerRelocation
--- PASS: TestRecoveryCleanerRelocation (219.82s)
=== RUN   TestRecoveryCleanerDataSize
--- PASS: TestRecoveryCleanerDataSize (219.70s)
=== RUN   TestRecoveryCleanerDeleteInstance
--- PASS: TestRecoveryCleanerDeleteInstance (442.70s)
=== RUN   TestRecoveryCleanerRecoveryPoint
--- PASS: TestRecoveryCleanerRecoveryPoint (42.24s)
=== RUN   TestRecoveryCleanerCorruptInstance
--- PASS: TestRecoveryCleanerCorruptInstance (0.21s)
=== RUN   TestRecoveryCleanerAhead
--- PASS: TestRecoveryCleanerAhead (4.25s)
=== RUN   TestRecoveryCleanerAheadAfterRecovery
--- PASS: TestRecoveryCleanerAheadAfterRecovery (2.29s)
=== RUN   TestCleaningUncommittedData
--- PASS: TestCleaningUncommittedData (0.06s)
=== RUN   TestPlasmaRecoverySimple
--- PASS: TestPlasmaRecoverySimple (0.05s)
=== RUN   TestPlasmaRecovery
--- PASS: TestPlasmaRecovery (39.17s)
=== RUN   TestShardRecoveryShared
--- PASS: TestShardRecoveryShared (15.77s)
=== RUN   TestShardRecoveryRecoveryLogAhead
--- PASS: TestShardRecoveryRecoveryLogAhead (48.62s)
=== RUN   TestShardRecoveryDataLogAhead
--- PASS: TestShardRecoveryDataLogAhead (32.16s)
=== RUN   TestShardRecoveryDestroyBlksInDataLog
--- PASS: TestShardRecoveryDestroyBlksInDataLog (14.89s)
=== RUN   TestShardRecoveryDestroyBlksInRecoveryLog
--- PASS: TestShardRecoveryDestroyBlksInRecoveryLog (15.19s)
=== RUN   TestShardRecoveryDestroyBlksInBothLog
--- PASS: TestShardRecoveryDestroyBlksInBothLog (14.91s)
=== RUN   TestShardRecoveryRecoveryLogCorruption
--- PASS: TestShardRecoveryRecoveryLogCorruption (14.54s)
=== RUN   TestShardRecoveryRecoveryLogCorruptionServerless
--- PASS: TestShardRecoveryRecoveryLogCorruptionServerless (14.65s)
=== RUN   TestShardRecoveryDataLogCorruption
--- PASS: TestShardRecoveryDataLogCorruption (15.36s)
=== RUN   TestShardRecoveryDataLogCorruptionServerless
--- PASS: TestShardRecoveryDataLogCorruptionServerless (15.91s)
=== RUN   TestShardRecoverySharedNoRP
--- PASS: TestShardRecoverySharedNoRP (15.25s)
=== RUN   TestShardRecoveryNotEnoughMem
--- PASS: TestShardRecoveryNotEnoughMem (38.56s)
=== RUN   TestShardRecoveryCleanup
--- PASS: TestShardRecoveryCleanup (0.51s)
=== RUN   TestShardRecoveryRebuildSharedLog
--- PASS: TestShardRecoveryRebuildSharedLog (1.76s)
=== RUN   TestShardRecoveryUpgradeWithCheckpoint
--- PASS: TestShardRecoveryUpgradeWithCheckpoint (0.61s)
=== RUN   TestShardRecoveryUpgradeWithLogReplay
--- PASS: TestShardRecoveryUpgradeWithLogReplay (0.57s)
=== RUN   TestShardRecoveryRebuildAfterError
--- PASS: TestShardRecoveryRebuildAfterError (1.72s)
=== RUN   TestShardRecoveryRebuildAfterConcurrentDelete
--- PASS: TestShardRecoveryRebuildAfterConcurrentDelete (2.61s)
=== RUN   TestShardRecoveryAfterDeleteInstance
--- PASS: TestShardRecoveryAfterDeleteInstance (0.15s)
=== RUN   TestShardRecoveryDestroyShard
--- PASS: TestShardRecoveryDestroyShard (0.23s)
=== RUN   TestHeaderRepair
--- PASS: TestHeaderRepair (0.08s)
=== RUN   TestCheckpointWithWriter
--- PASS: TestCheckpointWithWriter (4.73s)
=== RUN   TestPlasmaRecoveryWithRepairFullReplay
--- PASS: TestPlasmaRecoveryWithRepairFullReplay (32.19s)
=== RUN   TestPlasmaRecoveryWithInsertRepairCheckpoint
--- PASS: TestPlasmaRecoveryWithInsertRepairCheckpoint (32.49s)
=== RUN   TestPlasmaRecoveryWithDeleteRepairCheckpoint
--- PASS: TestPlasmaRecoveryWithDeleteRepairCheckpoint (13.45s)
=== RUN   TestShardRecoverySharedFullReplayOnError
--- PASS: TestShardRecoverySharedFullReplayOnError (17.71s)
=== RUN   TestShardRecoverySharedFullReplayOnErrorServerless
--- PASS: TestShardRecoverySharedFullReplayOnErrorServerless (18.12s)
=== RUN   TestShardRecoveryDedicatedFullReplayOnError
--- PASS: TestShardRecoveryDedicatedFullReplayOnError (17.23s)
=== RUN   TestShardRecoveryDedicatedFullReplayOnErrorServerless
--- PASS: TestShardRecoveryDedicatedFullReplayOnErrorServerless (17.97s)
=== RUN   TestShardRecoverySharedFullReplayOnErrorWithRepair
--- PASS: TestShardRecoverySharedFullReplayOnErrorWithRepair (19.62s)
=== RUN   TestGlobalWorkContextForRecovery
--- PASS: TestGlobalWorkContextForRecovery (0.48s)
=== RUN   TestShardRecoveryPartialMetadata
--- PASS: TestShardRecoveryPartialMetadata (0.11s)
=== RUN   TestShardRecoveryPartialLSSMetadata
--- PASS: TestShardRecoveryPartialLSSMetadata (0.12s)
=== RUN   TestShardRecoveryPartialSkiplog
--- PASS: TestShardRecoveryPartialSkiplog (0.13s)
=== RUN   TestShardRecoveryTempSkiplog
--- PASS: TestShardRecoveryTempSkiplog (0.26s)
=== RUN   TestPageRemovalLargeKey
--- PASS: TestPageRemovalLargeKey (0.05s)
=== RUN   TestRpVersionOverflow
--- PASS: TestRpVersionOverflow (0.11s)
=== RUN   TestSkipLogSimple
--- PASS: TestSkipLogSimple (0.00s)
=== RUN   TestSkipLogLoadStore
--- PASS: TestSkipLogLoadStore (0.02s)
=== RUN   TestShardMetadata
--- PASS: TestShardMetadata (0.09s)
=== RUN   TestPlasmaId
--- PASS: TestPlasmaId (0.05s)
=== RUN   TestShardPersistence
--- PASS: TestShardPersistence (0.30s)
=== RUN   TestShardDestroy
--- PASS: TestShardDestroy (0.09s)
=== RUN   TestShardClose
--- PASS: TestShardClose (5.05s)
=== RUN   TestShardMgrRecovery
--- PASS: TestShardMgrRecovery (0.10s)
=== RUN   TestShardDeadData
--- PASS: TestShardDeadData (0.32s)
=== RUN   TestShardConfigUpdate
--- PASS: TestShardConfigUpdate (0.05s)
=== RUN   TestShardWriteAmp
--- PASS: TestShardWriteAmp (10.17s)
=== RUN   TestShardStats
--- PASS: TestShardStats (0.25s)
=== RUN   TestShardMultipleWriters
--- PASS: TestShardMultipleWriters (0.27s)
=== RUN   TestShardDestroyMultiple
--- PASS: TestShardDestroyMultiple (0.15s)
=== RUN   TestShardBackupCorrupted
--- PASS: TestShardBackupCorrupted (0.13s)
=== RUN   TestShardBackupCorruptedShare
--- PASS: TestShardBackupCorruptedShare (0.09s)
=== RUN   TestShardCorruption
--- PASS: TestShardCorruption (0.11s)
=== RUN   TestShardCorruptionAddInstance
--- PASS: TestShardCorruptionAddInstance (0.22s)
=== RUN   TestShardCreateError
--- PASS: TestShardCreateError (0.25s)
=== RUN   TestShardNumInsts
--- PASS: TestShardNumInsts (1.69s)
=== RUN   TestShardInstanceGroup
--- PASS: TestShardInstanceGroup (0.09s)
=== RUN   TestShardLeak
--- PASS: TestShardLeak (2.05s)
=== RUN   TestShardMemLeak
--- PASS: TestShardMemLeak (1.09s)
=== RUN   TestShardFind
--- PASS: TestShardFind (0.22s)
=== RUN   TestShardFileOpenDescCount
--- PASS: TestShardFileOpenDescCount (87.36s)
=== RUN   TestShardShutdownSharedLSS
--- PASS: TestShardShutdownSharedLSS (11.79s)
=== RUN   TestShardUUIDChange
--- PASS: TestShardUUIDChange (0.07s)
=== RUN   TestShardUUIDStable
--- PASS: TestShardUUIDStable (0.06s)
=== RUN   TestShardMetadataPersistenceOnPanic
--- PASS: TestShardMetadataPersistenceOnPanic (0.17s)
=== RUN   TestSMRSimple
--- PASS: TestSMRSimple (1.13s)
=== RUN   TestSMRConcurrent
--- PASS: TestSMRConcurrent (72.67s)
=== RUN   TestSMRComplex
--- PASS: TestSMRComplex (151.19s)
=== RUN   TestDGMWithCASConflicts
--- PASS: TestDGMWithCASConflicts (40.12s)
=== RUN   TestMaxSMRPendingMem
--- PASS: TestMaxSMRPendingMem (0.03s)
=== RUN   TestStatsLogger
--- PASS: TestStatsLogger (20.45s)
=== RUN   TestStatsSamplePercentile
--- PASS: TestStatsSamplePercentile (0.03s)
=== RUN   TestPlasmaSwapper
--- PASS: TestPlasmaSwapper (22.45s)
=== RUN   TestPlasmaAutoSwapper
--- PASS: TestPlasmaAutoSwapper (84.30s)
=== RUN   TestSwapperAddInstance
--- PASS: TestSwapperAddInstance (4.31s)
=== RUN   TestSwapperRemoveInstance
--- PASS: TestSwapperRemoveInstance (4.28s)
=== RUN   TestSwapperJoinContext
--- PASS: TestSwapperJoinContext (4.79s)
=== RUN   TestSwapperSplitContext
--- PASS: TestSwapperSplitContext (4.73s)
=== RUN   TestSwapperGlobalClock
--- PASS: TestSwapperGlobalClock (29.95s)
=== RUN   TestSwapperConflict
--- PASS: TestSwapperConflict (2.85s)
=== RUN   TestSwapperRemoveInstanceWait
--- PASS: TestSwapperRemoveInstanceWait (3.45s)
=== RUN   TestSwapperStats
--- PASS: TestSwapperStats (0.91s)
=== RUN   TestSwapperSweepInterval
--- PASS: TestSwapperSweepInterval (0.47s)
=== RUN   TestSweepCompress
--- PASS: TestSweepCompress (0.06s)
=== RUN   TestTenantShardAssignment
--- PASS: TestTenantShardAssignment (4.20s)
=== RUN   TestTenantShardAssignmentServerless
--- PASS: TestTenantShardAssignmentServerless (4.17s)
=== RUN   TestTenantShardAssignmentDedicated
--- PASS: TestTenantShardAssignmentDedicated (2.18s)
=== RUN   TestTenantShardAssignmentDedicatedMainBackIndexes
--- PASS: TestTenantShardAssignmentDedicatedMainBackIndexes (0.13s)
=== RUN   TestTenantShardRecovery
--- PASS: TestTenantShardRecovery (3.93s)
=== RUN   TestTenantMemUsed
--- PASS: TestTenantMemUsed (3.79s)
=== RUN   TestTenantSwitchController
--- PASS: TestTenantSwitchController (0.12s)
=== RUN   TestTenantBuildController
--- PASS: TestTenantBuildController (2.21s)
=== RUN   TestTenantControllerSwapperIncremental
--- PASS: TestTenantControllerSwapperIncremental (1.36s)
=== RUN   TestTenantControllerSwapperInitial
--- PASS: TestTenantControllerSwapperInitial (1.35s)
=== RUN   TestTenantControllerSwapperPeriodic
--- PASS: TestTenantControllerSwapperPeriodic (3.52s)
=== RUN   TestTenantSwapperZeroResident
--- PASS: TestTenantSwapperZeroResident (2.77s)
=== RUN   TestTenantSwapperZeroItem
--- PASS: TestTenantSwapperZeroItem (1.16s)
=== RUN   TestTenantAssignMandatoryQuota
--- PASS: TestTenantAssignMandatoryQuota (0.63s)
=== RUN   TestTenantMutationQuota
--- PASS: TestTenantMutationQuota (0.18s)
=== RUN   TestTenantMutationQuotaInitial
--- PASS: TestTenantMutationQuotaInitial (0.13s)
=== RUN   TestTenantMutationQuotaEmpty
--- PASS: TestTenantMutationQuotaEmpty (0.12s)
=== RUN   TestTenantMutationQuotaIncremental
--- PASS: TestTenantMutationQuotaIncremental (0.14s)
=== RUN   TestTenantCalibrateMutationQuota
--- PASS: TestTenantCalibrateMutationQuota (0.08s)
=== RUN   TestTenantMutationRate
--- PASS: TestTenantMutationRate (0.32s)
=== RUN   TestTenantMutationQuotaAdjustRate
--- PASS: TestTenantMutationQuotaAdjustRate (0.44s)
=== RUN   TestTenantBuildQuota
--- PASS: TestTenantBuildQuota (0.06s)
=== RUN   TestTenantInitialBuildQuota
--- PASS: TestTenantInitialBuildQuota (0.07s)
=== RUN   TestTenantInitialBuildNonDGM
--- PASS: TestTenantInitialBuildNonDGM (3.13s)
=== RUN   TestTenantInitialBuildDGM
--- PASS: TestTenantInitialBuildDGM (3.13s)
=== RUN   TestTenantInitialBuildZeroResident
--- PASS: TestTenantInitialBuildZeroResident (3.19s)
=== RUN   TestTenantIncrementalBuildDGM
--- PASS: TestTenantIncrementalBuildDGM (5.01s)
=== RUN   TestTenantInitialBuildTwoTenants
--- PASS: TestTenantInitialBuildTwoTenants (5.06s)
=== RUN   TestTenantInitialBuildTwoControllers
--- PASS: TestTenantInitialBuildTwoControllers (5.05s)
=== RUN   TestTenantIncrementalBuildTwoIndexes
--- PASS: TestTenantIncrementalBuildTwoIndexes (0.57s)
=== RUN   TestTenantIncrementalBuildConcurrent
--- PASS: TestTenantIncrementalBuildConcurrent (7.15s)
=== RUN   TestTenantDecrementGlobalQuota
--- PASS: TestTenantDecrementGlobalQuota (3.83s)
=== RUN   TestTenantInitialBuildNotEnoughQuota
--- PASS: TestTenantInitialBuildNotEnoughQuota (5.07s)
=== RUN   TestTenantRecoveryResidentRatioHeaderReplay
--- PASS: TestTenantRecoveryResidentRatioHeaderReplay (0.21s)
=== RUN   TestTenantRecoveryResidentRatioDataReplay
--- PASS: TestTenantRecoveryResidentRatioDataReplay (0.26s)
=== RUN   TestTenantRecoveryController
--- PASS: TestTenantRecoveryController (2.55s)
=== RUN   TestTenantRecoveryQuotaWithLastCheckpoint
--- PASS: TestTenantRecoveryQuotaWithLastCheckpoint (1.30s)
=== RUN   TestTenantRecoveryQuotaZeroResidentWithLastCheckpoint
--- PASS: TestTenantRecoveryQuotaZeroResidentWithLastCheckpoint (5.17s)
=== RUN   TestTenantRecoveryQuotaWithFormula
--- PASS: TestTenantRecoveryQuotaWithFormula (5.16s)
=== RUN   TestTenantRecoveryQuotaWithDataReplay
--- PASS: TestTenantRecoveryQuotaWithDataReplay (11.01s)
=== RUN   TestTenantRecoveryEvictionNoCheckpoint
--- PASS: TestTenantRecoveryEvictionNoCheckpoint (24.99s)
=== RUN   TestTenantRecoveryEvictionHeaderReplay
--- PASS: TestTenantRecoveryEvictionHeaderReplay (14.26s)
=== RUN   TestTenantRecoveryEvictionDataReplaySequential
--- PASS: TestTenantRecoveryEvictionDataReplaySequential (13.42s)
=== RUN   TestTenantRecoveryEvictionDataReplayInterleaved
--- PASS: TestTenantRecoveryEvictionDataReplayInterleaved (15.21s)
=== RUN   TestTenantRecoveryEvictionDataReplayNoCheckpoint
--- PASS: TestTenantRecoveryEvictionDataReplayNoCheckpoint (15.30s)
=== RUN   TestTenantRecoveryEvictionDataReplaySingle
--- PASS: TestTenantRecoveryEvictionDataReplaySingle (7.00s)
=== RUN   TestTenantRecoveryLastCheckpoint
--- PASS: TestTenantRecoveryLastCheckpoint (8.10s)
=== RUN   TestTenantRecoveryRequestQuota
--- PASS: TestTenantRecoveryRequestQuota (4.00s)
=== RUN   TestTenantWorkingSetPageHits
--- PASS: TestTenantWorkingSetPageHits (0.04s)
=== RUN   TestTenantWorkingSetConcurrent
--- PASS: TestTenantWorkingSetConcurrent (0.26s)
=== RUN   TestTenantDiscretionaryQuotaNoScan
--- PASS: TestTenantDiscretionaryQuotaNoScan (3.26s)
=== RUN   TestTenantDiscretionaryQuotaScan
--- PASS: TestTenantDiscretionaryQuotaScan (12.40s)
=== RUN   TestTenantAssignDiscretionaryQuota
--- PASS: TestTenantAssignDiscretionaryQuota (36.98s)
=== RUN   TestTenantMinimumQuota
--- PASS: TestTenantMinimumQuota (12.72s)
=== RUN   TestTenantWorkingSetIdle
--- PASS: TestTenantWorkingSetIdle (0.03s)
=== RUN   TestTenantWorkingSetSinceIdle
--- PASS: TestTenantWorkingSetSinceIdle (14.66s)
=== RUN   TestTenantWorkingSetIdleTime
--- PASS: TestTenantWorkingSetIdleTime (1.03s)
=== RUN   TestTenantAssignIdleQuota
--- PASS: TestTenantAssignIdleQuota (12.98s)
=== RUN   TestTenantAssignIdleQuotaEmptyIndex
--- PASS: TestTenantAssignIdleQuotaEmptyIndex (0.17s)
=== RUN   TestTenantActivateIdleTenant
--- PASS: TestTenantActivateIdleTenant (37.37s)
=== RUN   TestTenantActivateIdleTenantWriter
--- PASS: TestTenantActivateIdleTenantWriter (37.44s)
=== RUN   TestTenantActivateIdleTenantZeroQuota
--- PASS: TestTenantActivateIdleTenantZeroQuota (0.80s)
=== RUN   TestTenantStatsInStr
--- PASS: TestTenantStatsInStr (0.03s)
=== RUN   TestTenantMandatoryQuotaLowResident
--- PASS: TestTenantMandatoryQuotaLowResident (62.95s)
=== RUN   TestTenantMandatoryDeleteIndex
--- PASS: TestTenantMandatoryDeleteIndex (69.49s)
=== RUN   TestTenantMandatoryQuotaSmallIndex
--- PASS: TestTenantMandatoryQuotaSmallIndex (0.06s)
=== RUN   TestTenantNumShards
--- PASS: TestTenantNumShards (1.78s)
=== RUN   TestTenantNumShardsMixedConfig
--- PASS: TestTenantNumShardsMixedConfig (2.93s)
=== RUN   TestWriteRenameFileWithSync
--- PASS: TestWriteRenameFileWithSync (0.01s)
=== RUN   TestSCtx
--- PASS: TestSCtx (17.44s)
=== RUN   TestWCtxGeneric
--- PASS: TestWCtxGeneric (20.65s)
=== RUN   TestWCtxWriter
--- PASS: TestWCtxWriter (20.61s)
=== RUN   TestSCtxTrimWithReader
--- PASS: TestSCtxTrimWithReader (0.05s)
=== RUN   TestSCtxTrimWithWriter
--- PASS: TestSCtxTrimWithWriter (0.04s)
=== RUN   TestSCtxTrimEmpty
--- PASS: TestSCtxTrimEmpty (0.03s)
=== RUN   TestWCtxTrimWithReader
--- PASS: TestWCtxTrimWithReader (0.04s)
=== RUN   TestWCtxTrimWithWriter
--- PASS: TestWCtxTrimWithWriter (0.05s)
--- PASS: TestExtrasN1 (0.00s)
--- PASS: TestExtrasN3 (0.00s)
--- PASS: TestExtrasN2 (0.00s)
FAIL
FAIL	github.com/couchbase/plasma	4825.344s
=== RUN   TestInteger
--- PASS: TestInteger (0.00s)
=== RUN   TestSmallDecimal
--- PASS: TestSmallDecimal (0.00s)
=== RUN   TestLargeDecimal
--- PASS: TestLargeDecimal (0.00s)
=== RUN   TestFloat
--- PASS: TestFloat (0.00s)
=== RUN   TestSuffixCoding
--- PASS: TestSuffixCoding (0.00s)
=== RUN   TestCodecLength
--- PASS: TestCodecLength (0.00s)
=== RUN   TestSpecialString
--- PASS: TestSpecialString (0.00s)
=== RUN   TestCodecNoLength
--- PASS: TestCodecNoLength (0.00s)
=== RUN   TestCodecJSON
--- PASS: TestCodecJSON (0.00s)
=== RUN   TestReference
--- PASS: TestReference (0.00s)
=== RUN   TestN1QLEncode
--- PASS: TestN1QLEncode (0.00s)
=== RUN   TestArrayExplodeJoin
--- PASS: TestArrayExplodeJoin (0.00s)
=== RUN   TestN1QLDecode
--- PASS: TestN1QLDecode (0.00s)
=== RUN   TestN1QLDecode2
--- PASS: TestN1QLDecode2 (0.00s)
=== RUN   TestArrayExplodeJoin2
--- PASS: TestArrayExplodeJoin2 (0.00s)
=== RUN   TestMB28956
--- PASS: TestMB28956 (0.00s)
=== RUN   TestFixEncodedInt
--- PASS: TestFixEncodedInt (0.00s)
=== RUN   TestN1QLDecodeLargeInt64
--- PASS: TestN1QLDecodeLargeInt64 (0.00s)
=== RUN   TestMixedModeFixEncodedInt
TESTING [4111686018427387900, -8223372036854775808, 822337203685477618] 
PASS 
TESTING [0] 
PASS 
TESTING [0.0] 
PASS 
TESTING [0.0000] 
PASS 
TESTING [0.0000000] 
PASS 
TESTING [-0] 
PASS 
TESTING [-0.0] 
PASS 
TESTING [-0.0000] 
PASS 
TESTING [-0.0000000] 
PASS 
TESTING [1] 
PASS 
TESTING [20] 
PASS 
TESTING [3456] 
PASS 
TESTING [7645000] 
PASS 
TESTING [9223372036854775807] 
PASS 
TESTING [9223372036854775806] 
PASS 
TESTING [9223372036854775808] 
PASS 
TESTING [92233720368547758071234000] 
PASS 
TESTING [92233720368547758071234987437653] 
PASS 
TESTING [12300000000000000000000000000000056] 
PASS 
TESTING [12300000000000000000000000000000000] 
PASS 
TESTING [123000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000056] 
PASS 
TESTING [123000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000] 
PASS 
TESTING [12300000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000056] 
PASS 
TESTING [210690] 
PASS 
TESTING [90000] 
PASS 
TESTING [123000000] 
PASS 
TESTING [3.60e2] 
PASS 
TESTING [36e2] 
PASS 
TESTING [1.9999999999e10] 
PASS 
TESTING [1.99999e10] 
PASS 
TESTING [1.99999e5] 
PASS 
TESTING [0.00000000000012e15] 
PASS 
TESTING [7.64507352e8] 
PASS 
TESTING [9.2233720368547758071234987437653e31] 
PASS 
TESTING [2650e-1] 
PASS 
TESTING [26500e-1] 
PASS 
TESTING [-1] 
PASS 
TESTING [-20] 
PASS 
TESTING [-3456] 
PASS 
TESTING [-7645000] 
PASS 
TESTING [-9223372036854775808] 
PASS 
TESTING [-9223372036854775807] 
PASS 
TESTING [-9223372036854775806] 
PASS 
TESTING [-9223372036854775809] 
PASS 
TESTING [-92233720368547758071234000] 
PASS 
TESTING [-92233720368547758071234987437653] 
PASS 
TESTING [-12300000000000000000000000000000056] 
PASS 
TESTING [-12300000000000000000000000000000000] 
PASS 
TESTING [-123000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000056] 
PASS 
TESTING [-123000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000] 
PASS 
TESTING [-210690] 
PASS 
TESTING [-90000] 
PASS 
TESTING [-123000000] 
PASS 
TESTING [-3.60e2] 
PASS 
TESTING [-36e2] 
PASS 
TESTING [-1.9999999999e10] 
PASS 
TESTING [-1.99999e10] 
PASS 
TESTING [-1.99999e5] 
PASS 
TESTING [-0.00000000000012e15] 
PASS 
TESTING [-2650e-1] 
PASS 
TESTING [-26500e-1] 
PASS 
TESTING [0.03] 
PASS 
TESTING [198.60] 
PASS 
TESTING [2000045.178] 
PASS 
TESTING [1.7976931348623157e+308] 
PASS 
TESTING [0.000000000000000000890] 
PASS 
TESTING [257953786.9864236576] 
PASS 
TESTING [257953786.9864236576e8] 
PASS 
TESTING [36.912e3] 
PASS 
TESTING [2761.67e0] 
PASS 
TESTING [2761.67e00] 
PASS 
TESTING [2761.67e000] 
PASS 
TESTING [7676546.67e-3] 
PASS 
TESTING [-0.03] 
PASS 
TESTING [-198.60] 
PASS 
TESTING [-2000045.178] 
PASS 
TESTING [-1.7976931348623157e+308] 
PASS 
TESTING [-0.000000000000000000890] 
PASS 
TESTING [-257953786.9864236576] 
PASS 
TESTING [-257953786.9864236576e8] 
PASS 
TESTING [-36.912e3] 
PASS 
TESTING [-2761.67e0] 
PASS 
TESTING [-2761.67e00] 
PASS 
TESTING [-2761.67e000] 
PASS 
TESTING [-7676546.67e-3] 
PASS 
--- PASS: TestMixedModeFixEncodedInt (0.01s)
=== RUN   TestCodecDesc
--- PASS: TestCodecDesc (0.00s)
=== RUN   TestCodecDescPropLen
--- PASS: TestCodecDescPropLen (0.00s)
=== RUN   TestCodecDescSplChar
--- PASS: TestCodecDescSplChar (0.00s)
PASS
ok  	github.com/couchbase/indexing/secondary/collatejson	0.034s
=== RUN   TestForestDBIterator
2023-02-16T17:56:05.654+05:30 [INFO][FDB] Forestdb blockcache size 134217728 initialized in 5517 us

2023-02-16T17:56:05.655+05:30 [INFO][FDB] Forestdb opened database file test
2023-02-16T17:56:05.658+05:30 [INFO][FDB] Forestdb closed database file test
--- PASS: TestForestDBIterator (0.01s)
=== RUN   TestForestDBIteratorSeek
2023-02-16T17:56:05.658+05:30 [INFO][FDB] Forestdb opened database file test
2023-02-16T17:56:05.660+05:30 [INFO][FDB] Forestdb closed database file test
--- PASS: TestForestDBIteratorSeek (0.00s)
=== RUN   TestPrimaryIndexEntry
--- PASS: TestPrimaryIndexEntry (0.00s)
=== RUN   TestSecondaryIndexEntry
--- PASS: TestSecondaryIndexEntry (0.00s)
=== RUN   TestPrimaryIndexEntryMatch
--- PASS: TestPrimaryIndexEntryMatch (0.00s)
=== RUN   TestSecondaryIndexEntryMatch
--- PASS: TestSecondaryIndexEntryMatch (0.00s)
=== RUN   TestLongDocIdEntry
--- PASS: TestLongDocIdEntry (0.00s)
=== RUN   TestMemDBInsertionPerf
Maximum number of file descriptors = 200000
Set IO Concurrency: 7200
Initial build: 10000000 items took 59.727370731s -> 167427.42695033367 items/s
Incr build: 10000000 items took 2m47.663217914s -> 59643.373927901885 items/s
Main Index: {
"node_count":             18000000,
"soft_deletes":           0,
"read_conflicts":         0,
"insert_conflicts":       3,
"next_pointers_per_node": 1.3333,
"memory_used":            1695887660,
"node_allocs":            18000000,
"node_frees":             0,
"level_node_distribution":{
"level0": 13500190,
"level1": 3375550,
"level2": 843073,
"level3": 210794,
"level4": 52735,
"level5": 13348,
"level6": 3245,
"level7": 782,
"level8": 220,
"level9": 42,
"level10": 17,
"level11": 3,
"level12": 1,
"level13": 0,
"level14": 0,
"level15": 0,
"level16": 0,
"level17": 0,
"level18": 0,
"level19": 0,
"level20": 0,
"level21": 0,
"level22": 0,
"level23": 0,
"level24": 0,
"level25": 0,
"level26": 0,
"level27": 0,
"level28": 0,
"level29": 0,
"level30": 0,
"level31": 0,
"level32": 0
}
}
Back Index 0 : {
"FastHTCount":  625000,
"SlowHTCount":  0,
"Conflicts":   0,
"MemoryInUse": 26250000
}
Back Index 1 : {
"FastHTCount":  625000,
"SlowHTCount":  0,
"Conflicts":   0,
"MemoryInUse": 26250000
}
Back Index 2 : {
"FastHTCount":  625000,
"SlowHTCount":  0,
"Conflicts":   0,
"MemoryInUse": 26250000
}
Back Index 3 : {
"FastHTCount":  625000,
"SlowHTCount":  0,
"Conflicts":   0,
"MemoryInUse": 26250000
}
Back Index 4 : {
"FastHTCount":  625000,
"SlowHTCount":  0,
"Conflicts":   0,
"MemoryInUse": 26250000
}
Back Index 5 : {
"FastHTCount":  625000,
"SlowHTCount":  0,
"Conflicts":   0,
"MemoryInUse": 26250000
}
Back Index 6 : {
"FastHTCount":  625000,
"SlowHTCount":  0,
"Conflicts":   0,
"MemoryInUse": 26250000
}
Back Index 7 : {
"FastHTCount":  625000,
"SlowHTCount":  0,
"Conflicts":   0,
"MemoryInUse": 26250000
}
Back Index 8 : {
"FastHTCount":  625000,
"SlowHTCount":  0,
"Conflicts":   0,
"MemoryInUse": 26250000
}
Back Index 9 : {
"FastHTCount":  625000,
"SlowHTCount":  0,
"Conflicts":   0,
"MemoryInUse": 26250000
}
Back Index 10 : {
"FastHTCount":  625000,
"SlowHTCount":  0,
"Conflicts":   0,
"MemoryInUse": 26250000
}
Back Index 11 : {
"FastHTCount":  625000,
"SlowHTCount":  0,
"Conflicts":   0,
"MemoryInUse": 26250000
}
Back Index 12 : {
"FastHTCount":  625000,
"SlowHTCount":  0,
"Conflicts":   0,
"MemoryInUse": 26250000
}
Back Index 13 : {
"FastHTCount":  625000,
"SlowHTCount":  0,
"Conflicts":   0,
"MemoryInUse": 26250000
}
Back Index 14 : {
"FastHTCount":  625000,
"SlowHTCount":  0,
"Conflicts":   0,
"MemoryInUse": 26250000
}
Back Index 15 : {
"FastHTCount":  625000,
"SlowHTCount":  0,
"Conflicts":   0,
"MemoryInUse": 26250000
}
--- PASS: TestMemDBInsertionPerf (227.39s)
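The level_node_distribution printed above is consistent with a skiplist whose nodes are promoted with probability p = 1/4 (next_pointers_per_node ≈ 1/(1-p) = 1.3333). A sketch of the expected per-level counts under that assumption, for comparison against the observed stats:

```go
package main

import "fmt"

// expectedLevelCount gives the expected number of skiplist nodes whose
// topmost level is exactly `level`, assuming each node is independently
// promoted with probability p: n * (1-p) * p^level.
// (p = 0.25 is an inference from next_pointers_per_node ≈ 1.3333 above.)
func expectedLevelCount(n, p float64, level int) float64 {
	e := n * (1 - p)
	for i := 0; i < level; i++ {
		e *= p
	}
	return e
}

func main() {
	// node_count = 18000000 from the Main Index stats above.
	for level := 0; level <= 4; level++ {
		fmt.Printf("level%d: %.0f\n", level, expectedLevelCount(18000000, 0.25, level))
	}
}
```

Expected values (13500000, 3375000, 843750, ...) track the observed counts (13500190, 3375550, 843073, ...) closely.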
=== RUN   TestBasicsA
--- PASS: TestBasicsA (0.00s)
=== RUN   TestSizeA
--- PASS: TestSizeA (0.00s)
=== RUN   TestSizeWithFreelistA
--- PASS: TestSizeWithFreelistA (0.00s)
=== RUN   TestDequeueUptoSeqnoA
--- PASS: TestDequeueUptoSeqnoA (0.10s)
=== RUN   TestDequeueA
--- PASS: TestDequeueA (1.21s)
=== RUN   TestMultipleVbucketsA
--- PASS: TestMultipleVbucketsA (0.00s)
=== RUN   TestDequeueUptoFreelistA
--- PASS: TestDequeueUptoFreelistA (0.00s)
=== RUN   TestDequeueUptoFreelistMultVbA
--- PASS: TestDequeueUptoFreelistMultVbA (0.00s)
=== RUN   TestConcurrentEnqueueDequeueA
--- PASS: TestConcurrentEnqueueDequeueA (0.00s)
=== RUN   TestConcurrentEnqueueDequeueA1
--- PASS: TestConcurrentEnqueueDequeueA1 (10.01s)
=== RUN   TestEnqueueAppCh
--- PASS: TestEnqueueAppCh (2.00s)
=== RUN   TestDequeueN
--- PASS: TestDequeueN (0.00s)
=== RUN   TestConcurrentEnqueueDequeueN
--- PASS: TestConcurrentEnqueueDequeueN (0.00s)
=== RUN   TestConcurrentEnqueueDequeueN1
--- PASS: TestConcurrentEnqueueDequeueN1 (10.01s)
PASS
ok  	github.com/couchbase/indexing/secondary/indexer	251.495s
=== RUN   TestConnPoolBasicSanity
2023-02-16T18:00:20.448+05:30 [Info] [Queryport-connpool:127.0.0.1:15151] started poolsize 3 overflow 6 low WM 3 relConn batch size 1 ...
2023-02-16T18:00:20.655+05:30 [Info] [Queryport-connpool:127.0.0.1:15151] ... stopped
2023-02-16T18:00:21.448+05:30 [Info] [Queryport-connpool:127.0.0.1:15151] Stopping releaseConnsRoutine
--- PASS: TestConnPoolBasicSanity (5.00s)
=== RUN   TestConnRelease
2023-02-16T18:00:25.451+05:30 [Info] [Queryport-connpool:127.0.0.1:15151] started poolsize 500 overflow 10 low WM 40 relConn batch size 10 ...
Waiting for connections to get released
Waiting for more connections to get released
Waiting for further more connections to get released
2023-02-16T18:01:05.209+05:30 [Info] [Queryport-connpool:127.0.0.1:15151] ... stopped
2023-02-16T18:01:05.468+05:30 [Info] [Queryport-connpool:127.0.0.1:15151] Stopping releaseConnsRoutine
--- PASS: TestConnRelease (43.76s)
=== RUN   TestLongevity
2023-02-16T18:01:09.211+05:30 [Info] [Queryport-connpool:127.0.0.1:15151] started poolsize 500 overflow 10 low WM 40 relConn batch size 10 ...
Releasing 1 conns.
Getting 2 conns.
Releasing 2 conns.
Getting 4 conns.
Releasing 1 conns.
Getting 3 conns.
Releasing 0 conns.
Getting 0 conns.
Releasing 1 conns.
Getting 0 conns.
Releasing 4 conns.
Getting 1 conns.
Releasing 2 conns.
Getting 4 conns.
Releasing 3 conns.
Getting 4 conns.
Releasing 1 conns.
Getting 0 conns.
Releasing 2 conns.
Getting 1 conns.
Releasing 0 conns.
Getting 1 conns.
Releasing 3 conns.
Getting 3 conns.
Releasing 2 conns.
Getting 2 conns.
Releasing 2 conns.
Getting 3 conns.
Releasing 0 conns.
Getting 0 conns.
2023-02-16T18:01:47.657+05:30 [Info] [Queryport-connpool:127.0.0.1:15151] ... stopped
2023-02-16T18:01:48.229+05:30 [Info] [Queryport-connpool:127.0.0.1:15151] Stopping releaseConnsRoutine
--- PASS: TestLongevity (42.45s)
=== RUN   TestSustainedHighConns
2023-02-16T18:01:51.658+05:30 [Info] [Queryport-connpool:127.0.0.1:15151] started poolsize 500 overflow 10 low WM 40 relConn batch size 10 ...
Allocating 16 Connections
cp.curActConns = 0
Returning 3 Connections
cp.curActConns = 11
Returning 2 Connections
cp.curActConns = 11
Allocating 6 Connections
Returning 4 Connections
cp.curActConns = 13
Returning 1 Connections
Allocating 12 Connections
cp.curActConns = 22
Returning 1 Connections
cp.curActConns = 23
Allocating 10 Connections
Returning 1 Connections
cp.curActConns = 32
Returning 3 Connections
Allocating 15 Connections
cp.curActConns = 33
Returning 4 Connections
cp.curActConns = 40
Returning 3 Connections
Allocating 8 Connections
cp.curActConns = 44
Returning 2 Connections
cp.curActConns = 43
Allocating 3 Connections
Returning 4 Connections
cp.curActConns = 42
Allocating 9 Connections
Returning 3 Connections
cp.curActConns = 48
Returning 2 Connections
Allocating 21 Connections
cp.curActConns = 55
Returning 4 Connections
cp.curActConns = 63
Returning 4 Connections
Allocating 0 Connections
cp.curActConns = 59
Returning 0 Connections
Allocating 13 Connections
cp.curActConns = 68
Returning 3 Connections
cp.curActConns = 69
Allocating 3 Connections
Returning 0 Connections
cp.curActConns = 72
Returning 1 Connections
Allocating 10 Connections
cp.curActConns = 80
Returning 0 Connections
cp.curActConns = 81
Allocating 6 Connections
Returning 1 Connections
cp.curActConns = 86
Returning 3 Connections
Allocating 11 Connections
cp.curActConns = 92
Returning 2 Connections
cp.curActConns = 92
Allocating 8 Connections
Returning 1 Connections
cp.curActConns = 99
Returning 3 Connections
Allocating 1 Connections
cp.curActConns = 97
Returning 2 Connections
Allocating 18 Connections
cp.curActConns = 104
Returning 2 Connections
cp.curActConns = 111
Returning 4 Connections
Allocating 2 Connections
cp.curActConns = 109
Returning 3 Connections
Allocating 21 Connections
cp.curActConns = 116
Returning 0 Connections
cp.curActConns = 127
Returning 3 Connections
Allocating 3 Connections
cp.curActConns = 127
Returning 2 Connections
Allocating 8 Connections
cp.curActConns = 128
Returning 1 Connections
cp.curActConns = 132
Returning 4 Connections
Allocating 8 Connections
cp.curActConns = 136
Returning 3 Connections
Allocating 16 Connections
cp.curActConns = 139
Returning 2 Connections
cp.curActConns = 147
Returning 3 Connections
Allocating 11 Connections
cp.curActConns = 150
Returning 1 Connections
cp.curActConns = 154
Returning 2 Connections
Allocating 15 Connections
cp.curActConns = 161
Returning 3 Connections
cp.curActConns = 164
Returning 2 Connections
Allocating 18 Connections
cp.curActConns = 172
Returning 0 Connections
cp.curActConns = 180
Returning 3 Connections
Allocating 0 Connections
cp.curActConns = 177
Returning 1 Connections
Allocating 15 Connections
cp.curActConns = 184
Returning 2 Connections
cp.curActConns = 189
Returning 2 Connections
Allocating 10 Connections
cp.curActConns = 195
Returning 0 Connections
cp.curActConns = 197
Returning 0 Connections
Allocating 15 Connections
cp.curActConns = 207
Returning 2 Connections
cp.curActConns = 210
Returning 3 Connections
Allocating 3 Connections
cp.curActConns = 210
Returning 1 Connections
Allocating 2 Connections
cp.curActConns = 211
Returning 4 Connections
Allocating 2 Connections
cp.curActConns = 209
Returning 2 Connections
Allocating 2 Connections
cp.curActConns = 209
Returning 1 Connections
Allocating 4 Connections
cp.curActConns = 212
Returning 1 Connections
Allocating 2 Connections
cp.curActConns = 213
Returning 1 Connections
Allocating 4 Connections
cp.curActConns = 216
Returning 1 Connections
Allocating 0 Connections
cp.curActConns = 215
Returning 3 Connections
Allocating 1 Connections
cp.curActConns = 213
Returning 4 Connections
Allocating 1 Connections
cp.curActConns = 210
Returning 2 Connections
Allocating 0 Connections
cp.curActConns = 208
Returning 0 Connections
Allocating 0 Connections
cp.curActConns = 208
Returning 2 Connections
Allocating 4 Connections
cp.curActConns = 210
Returning 3 Connections
Allocating 3 Connections
cp.curActConns = 210
Returning 4 Connections
Allocating 3 Connections
cp.curActConns = 209
Returning 4 Connections
Allocating 2 Connections
cp.curActConns = 207
Returning 2 Connections
Allocating 2 Connections
cp.curActConns = 207
Returning 1 Connections
Allocating 4 Connections
cp.curActConns = 210
Returning 0 Connections
Allocating 1 Connections
cp.curActConns = 211
Returning 1 Connections
Allocating 1 Connections
cp.curActConns = 211
Returning 0 Connections
Allocating 1 Connections
cp.curActConns = 212
Returning 4 Connections
Allocating 4 Connections
cp.curActConns = 212
Returning 0 Connections
Allocating 3 Connections
cp.curActConns = 215
Returning 1 Connections
Allocating 0 Connections
cp.curActConns = 214
Returning 2 Connections
Allocating 3 Connections
cp.curActConns = 215
Returning 2 Connections
Allocating 1 Connections
cp.curActConns = 214
Returning 3 Connections
Allocating 4 Connections
cp.curActConns = 215
Returning 2 Connections
Allocating 3 Connections
cp.curActConns = 216
Returning 2 Connections
Allocating 3 Connections
cp.curActConns = 217
Returning 3 Connections
Allocating 3 Connections
cp.curActConns = 217
Returning 1 Connections
Allocating 0 Connections
cp.curActConns = 216
Returning 0 Connections
Allocating 0 Connections
cp.curActConns = 216
Returning 4 Connections
Allocating 2 Connections
cp.curActConns = 214
Returning 2 Connections
Allocating 0 Connections
cp.curActConns = 212
Returning 1 Connections
Allocating 4 Connections
cp.curActConns = 215
Returning 0 Connections
Allocating 3 Connections
cp.curActConns = 218
Returning 0 Connections
Allocating 4 Connections
cp.curActConns = 218
Returning 0 Connections
cp.curActConns = 222
Returning 3 Connections
Allocating 0 Connections
cp.curActConns = 219
Returning 4 Connections
Allocating 2 Connections
cp.curActConns = 217
Returning 0 Connections
Allocating 0 Connections
cp.curActConns = 217
Returning 2 Connections
Allocating 4 Connections
cp.curActConns = 219
Returning 0 Connections
Allocating 3 Connections
cp.curActConns = 222
Returning 1 Connections
Allocating 0 Connections
cp.curActConns = 221
Returning 1 Connections
Allocating 0 Connections
cp.curActConns = 220
Returning 1 Connections
Allocating 4 Connections
cp.curActConns = 223
Returning 2 Connections
Allocating 1 Connections
cp.curActConns = 222
Returning 2 Connections
Allocating 1 Connections
cp.curActConns = 221
Returning 0 Connections
Allocating 4 Connections
cp.curActConns = 225
Returning 2 Connections
Allocating 2 Connections
cp.curActConns = 225
Returning from startDeallocatorRoutine
Returning from startAllocatorRoutine
2023-02-16T18:02:46.726+05:30 [Info] [Queryport-connpool:127.0.0.1:15151] ... stopped
2023-02-16T18:02:47.692+05:30 [Info] [Queryport-connpool:127.0.0.1:15151] Stopping releaseConnsRoutine
--- PASS: TestSustainedHighConns (59.07s)
=== RUN   TestLowWM
2023-02-16T18:02:50.727+05:30 [Info] [Queryport-connpool:127.0.0.1:15151] started poolsize 20 overflow 5 low WM 10 relConn batch size 2 ...
2023-02-16T18:03:50.744+05:30 [Info] [Queryport-connpool:127.0.0.1:15151] active conns 0, free conns 10
2023-02-16T18:04:50.761+05:30 [Info] [Queryport-connpool:127.0.0.1:15151] active conns 0, free conns 10
2023-02-16T18:04:56.239+05:30 [Info] [Queryport-connpool:127.0.0.1:15151] ... stopped
2023-02-16T18:04:56.763+05:30 [Info] [Queryport-connpool:127.0.0.1:15151] Stopping releaseConnsRoutine
--- PASS: TestLowWM (129.51s)
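The TestLowWM log above (pool started with poolsize 20, low WM 10, batch size 2; free conns settle at 10) reflects idle connections being released in batches down to the low watermark. An illustrative model of that behavior, not the actual queryport connpool code:

```go
package main

import "fmt"

// releaseToLowWM models how a pool with a low watermark might shed idle
// connections: free connections are closed batchSize at a time until
// only lowWM remain. Hypothetical sketch for illustration.
func releaseToLowWM(freeConns, lowWM, batchSize int) int {
	for freeConns > lowWM {
		n := batchSize
		if freeConns-lowWM < n {
			n = freeConns - lowWM // final partial batch
		}
		freeConns -= n // close n idle connections
	}
	return freeConns
}

func main() {
	fmt.Println(releaseToLowWM(20, 10, 2)) // drains to the low watermark: 10
}
```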
=== RUN   TestTotalConns
2023-02-16T18:05:00.241+05:30 [Info] [Queryport-connpool:127.0.0.1:15151] started poolsize 120 overflow 5 low WM 10 relConn batch size 10 ...
2023-02-16T18:05:14.444+05:30 [Info] [Queryport-connpool:127.0.0.1:15151] ... stopped
2023-02-16T18:05:15.248+05:30 [Info] [Queryport-connpool:127.0.0.1:15151] Stopping releaseConnsRoutine
--- PASS: TestTotalConns (18.20s)
=== RUN   TestUpdateTickRate
2023-02-16T18:05:18.445+05:30 [Info] [Queryport-connpool:127.0.0.1:15151] started poolsize 40 overflow 5 low WM 2 relConn batch size 2 ...
2023-02-16T18:05:39.295+05:30 [Info] [Queryport-connpool:127.0.0.1:15151] ... stopped
2023-02-16T18:05:39.455+05:30 [Info] [Queryport-connpool:127.0.0.1:15151] Stopping releaseConnsRoutine
--- PASS: TestUpdateTickRate (24.85s)
PASS
ok  	github.com/couchbase/indexing/secondary/queryport/client	322.903s
Starting server: attempt 1

Functional tests

2023/02/16 18:07:52 In TestMain()
2023/02/16 18:07:52 otp node fetch error: json: cannot unmarshal string into Go value of type couchbase.Pool
2023/02/16 18:07:52 Initialising services with role: kv,n1ql on node: 127.0.0.1:9000
2023/02/16 18:07:53 Initialising web UI on node: 127.0.0.1:9000
2023/02/16 18:07:53 InitWebCreds, response is: {"newBaseUri":"http://127.0.0.1:9000/"}
2023/02/16 18:07:54 Setting data quota of 1500M and Index quota of 1500M
2023/02/16 18:07:55 Adding node: https://127.0.0.1:19001 with role: kv,index to the cluster
2023/02/16 18:08:03 AddNode: Successfully added node: 127.0.0.1:9001 (role kv,index), response: {"otpNode":"n_1@127.0.0.1"}
2023/02/16 18:08:08 Rebalance progress: 0
2023/02/16 18:08:13 Rebalance progress: 0
2023/02/16 18:08:18 Rebalance progress: 100
2023/02/16 18:08:23 Created bucket default, responseBody: 
2023/02/16 18:08:28 Cluster status: map[127.0.0.1:9001:[index kv] 172.31.5.112:9000:[kv n1ql]]
2023/02/16 18:08:28 Successfully initialised cluster
2023/02/16 18:08:28 Cluster status: map[127.0.0.1:9001:[index kv] 172.31.5.112:9000:[kv n1ql]]
2023/02/16 18:08:28 Changing config key queryport.client.settings.backfillLimit to value 0
2023/02/16 18:08:29 Changing config key queryport.client.log_level to value Warn
2023/02/16 18:08:29 Changing config key indexer.api.enableTestServer to value true
2023/02/16 18:08:29 Changing config key indexer.settings.persisted_snapshot_init_build.moi.interval to value 60000
2023/02/16 18:08:29 Changing config key indexer.settings.persisted_snapshot.moi.interval to value 60000
2023/02/16 18:08:29 Changing config key indexer.settings.log_level to value info
2023/02/16 18:08:29 Changing config key indexer.settings.storage_mode.disable_upgrade to value true
2023/02/16 18:08:29 Using plasma for creating indexes
2023/02/16 18:08:29 Changing config key indexer.settings.storage_mode to value plasma
2023/02/16 18:08:34 Data file exists. Skipping download
2023/02/16 18:08:34 Data file exists. Skipping download
2023/02/16 18:08:36 In DropAllSecondaryIndexes()
2023/02/16 18:08:36 Emptying the default bucket
2023/02/16 18:08:39 Flush Enabled on bucket default, responseBody: 
2023/02/16 18:09:17 Flushed the bucket default, Response body: 
2023/02/16 18:09:17 Create index on the empty default bucket
2023/02/16 18:09:20 Created the secondary index index_eyeColor. Waiting for it to become active
2023/02/16 18:09:20 Index is 1046578615528683010 now active
2023/02/16 18:09:20 Populating the default bucket
=== RUN   TestScanAfterBucketPopulate
2023/02/16 18:09:30 In TestScanAfterBucketPopulate()
2023/02/16 18:09:30 Create an index on an empty bucket, populate the bucket, and run a scan on the index
2023/02/16 18:09:30 Using n1ql client
2023-02-16T18:09:30.279+05:30 [Info] creating GsiClient for 127.0.0.1:9000
2023/02/16 18:09:30 Expected and Actual scan responses are the same
--- PASS: TestScanAfterBucketPopulate (0.12s)
=== RUN   TestRestartNilSnapshot
2023/02/16 18:09:30 In TestRestartNilSnapshot()
2023/02/16 18:09:34 Created the secondary index idx_age. Waiting for it to become active
2023/02/16 18:09:34 Index is 6297773815257594590 now active
2023/02/16 18:09:34 Restarting indexer process ...
2023/02/16 18:09:34 []
2023-02-16T18:09:34.815+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023-02-16T18:09:34.816+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023/02/16 18:12:54 Using n1ql client
2023-02-16T18:12:54.765+05:30 [Error] transport error between 127.0.0.1:56038->127.0.0.1:9107: write tcp 127.0.0.1:56038->127.0.0.1:9107: write: broken pipe
2023-02-16T18:12:54.765+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] 4455141576836628867 request transport failed `write tcp 127.0.0.1:56038->127.0.0.1:9107: write: broken pipe`
2023-02-16T18:12:54.765+05:30 [Error] metadataClient:PickRandom: Fail to find indexer for all index partitions. Num partition 1.  Partition with instances 0 
2023-02-16T18:12:54.765+05:30 [Error] metadataClient:PickRandom: Replicas - [10422158447916394500], PrunedReplica - map[], FilteredReplica map[]
2023/02/16 18:12:54 Expected and Actual scan responses are the same
--- PASS: TestRestartNilSnapshot (204.44s)
=== RUN   TestThreeIndexCreates
2023/02/16 18:12:54 In TestThreeIndexCreates()
2023/02/16 18:12:59 Created the secondary index index_balance. Waiting for it to become active
2023/02/16 18:12:59 Index is 15161702877138498880 now active
2023/02/16 18:12:59 Create docs mutations
2023/02/16 18:12:59 Using n1ql client
2023/02/16 18:12:59 Expected and Actual scan responses are the same
2023/02/16 18:13:05 Created the secondary index index_email. Waiting for it to become active
2023/02/16 18:13:05 Index is 207814294162102803 now active
2023/02/16 18:13:05 Create docs mutations
2023/02/16 18:13:05 Using n1ql client
2023/02/16 18:13:05 Expected and Actual scan responses are the same
2023/02/16 18:13:12 Created the secondary index index_pin. Waiting for it to become active
2023/02/16 18:13:12 Index is 5725540221063228564 now active
2023/02/16 18:13:12 Delete docs mutations
2023/02/16 18:13:12 Using n1ql client
2023/02/16 18:13:12 Expected and Actual scan responses are the same
--- PASS: TestThreeIndexCreates (17.73s)
=== RUN   TestMultipleIndexCreatesDropsWithMutations
2023/02/16 18:13:12 In TestMultipleIndexCreatesDropsWithMutations()
2023/02/16 18:13:18 Created the secondary index index_state. Waiting for it to become active
2023/02/16 18:13:18 Index is 6995605453167308034 now active
2023/02/16 18:13:18 Create docs mutations
2023/02/16 18:13:19 Using n1ql client
2023/02/16 18:13:19 Expected and Actual scan responses are the same
2023/02/16 18:13:25 Created the secondary index index_registered. Waiting for it to become active
2023/02/16 18:13:25 Index is 10241685915932492372 now active
2023/02/16 18:13:25 Create docs mutations
2023/02/16 18:13:25 Using n1ql client
2023/02/16 18:13:25 Expected and Actual scan responses are the same
2023/02/16 18:13:32 Created the secondary index index_gender. Waiting for it to become active
2023/02/16 18:13:32 Index is 16740037183678644288 now active
2023/02/16 18:13:32 Create docs mutations
2023/02/16 18:13:32 Using n1ql client
2023/02/16 18:13:32 Expected and Actual scan responses are the same
2023/02/16 18:13:32 Dropping the secondary index index_registered
2023/02/16 18:13:32 Index dropped
2023/02/16 18:13:32 Create docs mutations
2023/02/16 18:13:32 Delete docs mutations
2023/02/16 18:13:32 Using n1ql client
2023/02/16 18:13:32 Expected and Actual scan responses are the same
2023/02/16 18:13:38 Created the secondary index index_longitude. Waiting for it to become active
2023/02/16 18:13:38 Index is 6561335076729091033 now active
2023/02/16 18:13:38 Create docs mutations
2023/02/16 18:13:38 Using n1ql client
2023/02/16 18:13:38 Expected and Actual scan responses are the same
--- PASS: TestMultipleIndexCreatesDropsWithMutations (26.09s)
=== RUN   TestCreateDropScan
2023/02/16 18:13:38 In TestCreateDropScan()
2023/02/16 18:13:44 Created the secondary index index_cd. Waiting for it to become active
2023/02/16 18:13:44 Index is 8115096129840759803 now active
2023/02/16 18:13:44 Using n1ql client
2023/02/16 18:13:44 Expected and Actual scan responses are the same
2023/02/16 18:13:44 Dropping the secondary index index_cd
2023/02/16 18:13:45 Index dropped
2023/02/16 18:13:45 Using n1ql client
2023/02/16 18:13:45 Scan failed as expected with error: Index Not Found - cause: GSI index index_cd not found.
--- PASS: TestCreateDropScan (6.39s)
=== RUN   TestCreateDropCreate
2023/02/16 18:13:45 In TestCreateDropCreate()
2023/02/16 18:13:51 Created the secondary index index_cdc. Waiting for it to become active
2023/02/16 18:13:51 Index is 4576566473654297478 now active
2023/02/16 18:13:51 Using n1ql client
2023/02/16 18:13:51 Expected and Actual scan responses are the same
2023/02/16 18:13:51 Dropping the secondary index index_cdc
2023/02/16 18:13:51 Index dropped
2023/02/16 18:13:51 Using n1ql client
2023/02/16 18:13:51 Scan 2 failed as expected with error: Index Not Found - cause: GSI index index_cdc not found.
2023/02/16 18:13:57 Created the secondary index index_cdc. Waiting for it to become active
2023/02/16 18:13:57 Index is 12142363860036702469 now active
2023/02/16 18:13:57 Using n1ql client
2023/02/16 18:13:57 Expected and Actual scan responses are the same
2023/02/16 18:13:57 (Inclusion 1) Lengths of expected and actual scan results are 5053 and 5053. Num of docs in bucket = 10402
2023/02/16 18:13:57 Using n1ql client
2023/02/16 18:13:57 Expected and Actual scan responses are the same
2023/02/16 18:13:57 (Inclusion 3) Lengths of expected and actual scan results are 5053 and 5053. Num of docs in bucket = 10402
--- PASS: TestCreateDropCreate (12.78s)
=== RUN   TestCreate2Drop1Scan2
2023/02/16 18:13:57 In TestCreate2Drop1Scan2()
2023/02/16 18:14:04 Created the secondary index index_i1. Waiting for it to become active
2023/02/16 18:14:04 Index is 155651049354373606 now active
2023/02/16 18:14:10 Created the secondary index index_i2. Waiting for it to become active
2023/02/16 18:14:10 Index is 569203118087886275 now active
2023/02/16 18:14:10 Using n1ql client
2023/02/16 18:14:10 Expected and Actual scan responses are the same
2023/02/16 18:14:10 Using n1ql client
2023/02/16 18:14:10 Expected and Actual scan responses are the same
2023/02/16 18:14:10 Dropping the secondary index index_i1
2023/02/16 18:14:10 Index dropped
2023/02/16 18:14:10 Using n1ql client
2023/02/16 18:14:10 Expected and Actual scan responses are the same
--- PASS: TestCreate2Drop1Scan2 (12.75s)
=== RUN   TestIndexNameCaseSensitivity
2023/02/16 18:14:10 In TestIndexNameCaseSensitivity()
2023/02/16 18:14:16 Created the secondary index index_age. Waiting for it to become active
2023/02/16 18:14:16 Index is 4819862522157875049 now active
2023/02/16 18:14:16 Using n1ql client
2023/02/16 18:14:16 Expected and Actual scan responses are the same
2023/02/16 18:14:16 Using n1ql client
2023/02/16 18:14:16 Scan failed as expected with error: Index Not Found - cause: GSI index index_Age not found.
--- PASS: TestIndexNameCaseSensitivity (6.24s)
=== RUN   TestCreateDuplicateIndex
2023/02/16 18:14:16 In TestCreateDuplicateIndex()
2023/02/16 18:14:22 Created the secondary index index_di1. Waiting for it to become active
2023/02/16 18:14:22 Index is 13299911482769867697 now active
2023/02/16 18:14:22 Index found:  index_di1
2023/02/16 18:14:22 Create failed as expected with error: Index index_di1 already exists.
--- PASS: TestCreateDuplicateIndex (6.14s)
=== RUN   TestDropNonExistingIndex
2023/02/16 18:14:22 In TestDropNonExistingIndex()
2023/02/16 18:14:22 Dropping the secondary index 123456
2023/02/16 18:14:22 Index drop failed as expected with error: Index does not exist.
--- PASS: TestDropNonExistingIndex (0.03s)
=== RUN   TestCreateIndexNonExistentBucket
2023/02/16 18:14:22 In TestCreateIndexNonExistentBucket()
2023-02-16T18:14:23.473+05:30 [Error] Encountered error during create index.  Error: [Bucket Not Found] Bucket BlahBucket does not exist or temporarily unavailable for creating new index. Please retry the operation at a later time.
2023-02-16T18:14:34.476+05:30 [Error] Fail to create index: [Bucket Not Found] Bucket BlahBucket does not exist or temporarily unavailable for creating new index. Please retry the operation at a later time.
2023/02/16 18:14:34 Index create failed as expected with error: [Bucket Not Found] Bucket BlahBucket does not exist or temporarily unavailable for creating new index. Please retry the operation at a later time.
--- PASS: TestCreateIndexNonExistentBucket (11.51s)
=== RUN   TestScanWithNoTimeout
2023/02/16 18:14:34 Create an index on an empty bucket, populate the bucket, and run a scan on the index
2023/02/16 18:14:34 Changing config key indexer.settings.scan_timeout to value 0
2023/02/16 18:14:34 Using n1ql client
2023/02/16 18:14:34 Expected and Actual scan responses are the same
--- PASS: TestScanWithNoTimeout (0.41s)
=== RUN   TestIndexingOnBinaryBucketMeta
2023/02/16 18:14:34 In TestIndexingOnBinaryBucketMeta()
2023/02/16 18:14:34 	 1. Populate a bucket with binary docs and create indexes on the `id`, `cas` and `expiration` fields of Metadata
2023/02/16 18:14:34 	 2. Validate the test by comparing the items_count of indexes and the number of docs in the bucket for each of the fields
2023/02/16 18:14:37 Modified parameters of bucket default, responseBody: 
2023/02/16 18:14:38 Created bucket binaryBucket, responseBody: 
2023/02/16 18:14:56 Created the secondary index index_binary_meta_id. Waiting for it to become active
2023/02/16 18:14:56 Index is 17315088689056113958 now active
2023/02/16 18:15:01 items_count stat is 10 for index index_binary_meta_id
2023/02/16 18:15:01 Dropping the secondary index index_binary_meta_id
2023/02/16 18:15:01 Index dropped
2023/02/16 18:15:04 Created the secondary index index_binary_meta_cas. Waiting for it to become active
2023/02/16 18:15:04 Index is 9677881803455870942 now active
2023/02/16 18:15:09 items_count stat is 10 for index index_binary_meta_cas
2023/02/16 18:15:09 Dropping the secondary index index_binary_meta_cas
2023/02/16 18:15:09 Index dropped
2023/02/16 18:15:12 Created the secondary index index_binary_meta_expiration. Waiting for it to become active
2023/02/16 18:15:12 Index is 7147667604309924914 now active
2023/02/16 18:15:18 items_count stat is 10 for index index_binary_meta_expiration
2023/02/16 18:15:18 Dropping the secondary index index_binary_meta_expiration
2023/02/16 18:15:18 Index dropped
2023/02/16 18:15:19 Deleted bucket binaryBucket, responseBody: 
2023/02/16 18:15:23 Modified parameters of bucket default, responseBody: 
--- PASS: TestIndexingOnBinaryBucketMeta (63.15s)
=== RUN   TestRetainDeleteXATTRBinaryDocs
2023/02/16 18:15:38 In TestRetainDeleteXATTRBinaryDocs()
2023/02/16 18:15:38 	 1. Populate a bucket with binary docs having system XATTRS
2023/02/16 18:15:38 	 2. Create index on the system XATTRS with "retain_deleted_xattr" attribute set to true
2023/02/16 18:15:38 	 3. Delete the documents in the bucket
2023/02/16 18:15:38 	 4. Query for the meta() information in the source bucket. The total number of results should be equivalent to the number of documents in the bucket before deletion of documents
2023/02/16 18:15:41 Modified parameters of bucket default, responseBody: 
2023/02/16 18:15:41 Created bucket binaryBucket, responseBody: 
2023/02/16 18:15:59 Created the secondary index index_system_xattr. Waiting for it to become active
2023/02/16 18:15:59 Index is 9767990421293842844 now active
2023/02/16 18:16:04 Deleted all the documents in bucket binaryBucket successfully
2023/02/16 18:16:07 Deleted bucket binaryBucket, responseBody: 
2023/02/16 18:16:10 Modified parameters of bucket default, responseBody: 
--- PASS: TestRetainDeleteXATTRBinaryDocs (47.59s)
=== RUN   TestIndexingOnXATTRs
2023/02/16 18:16:25 In TestIndexingOnXATTRs()
2023/02/16 18:16:28 Modified parameters of bucket default, responseBody: 
2023/02/16 18:16:28 Created bucket bucket_xattrs, responseBody: 
2023/02/16 18:16:48 Created the secondary index index_sync_rev. Waiting for it to become active
2023/02/16 18:16:48 Index is 14267050883425218516 now active
2023/02/16 18:16:54 Created the secondary index index_sync_channels. Waiting for it to become active
2023/02/16 18:16:54 Index is 6545600653097758156 now active
2023/02/16 18:17:00 Created the secondary index index_sync_sequence. Waiting for it to become active
2023/02/16 18:17:00 Index is 1270319663515948427 now active
2023/02/16 18:17:05 items_count stat is 100 for index index_sync_rev
2023/02/16 18:17:05 items_count stat is 100 for index index_sync_channels
2023/02/16 18:17:06 items_count stat is 100 for index index_sync_sequence
2023/02/16 18:17:06 Using n1ql client
2023-02-16T18:17:06.028+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T18:17:06.031+05:30 [Info] GSIC[default/bucket_xattrs-_default-_default-1676551626026629245] started ...
2023/02/16 18:17:06 Dropping the secondary index index_sync_rev
2023/02/16 18:17:06 Index dropped
2023/02/16 18:17:06 Using n1ql client
2023/02/16 18:17:06 Dropping the secondary index index_sync_channels
2023/02/16 18:17:06 Index dropped
2023/02/16 18:17:06 Using n1ql client
2023/02/16 18:17:06 Dropping the secondary index index_sync_sequence
2023/02/16 18:17:06 Index dropped
2023/02/16 18:17:08 Deleted bucket bucket_xattrs, responseBody: 
2023/02/16 18:17:11 Modified parameters of bucket default, responseBody: 
--- PASS: TestIndexingOnXATTRs (61.03s)
=== RUN   TestSimpleIndex_FloatDataType
2023/02/16 18:17:26 In TestSimpleIndex_FloatDataType()
2023/02/16 18:17:26 Index found:  index_age
2023/02/16 18:17:26 Using n1ql client
2023/02/16 18:17:26 Expected and Actual scan responses are the same
--- PASS: TestSimpleIndex_FloatDataType (0.03s)
=== RUN   TestSimpleIndex_StringDataType
2023/02/16 18:17:26 In TestSimpleIndex_StringDataType()
2023/02/16 18:17:31 Created the secondary index index_company. Waiting for it to become active
2023/02/16 18:17:31 Index is 11860568751140630795 now active
2023/02/16 18:17:31 Using n1ql client
2023/02/16 18:17:31 Expected and Actual scan responses are the same
2023/02/16 18:17:31 Using n1ql client
2023/02/16 18:17:31 Expected and Actual scan responses are the same
--- PASS: TestSimpleIndex_StringDataType (4.73s)
=== RUN   TestSimpleIndex_FieldValueCaseSensitivity
2023/02/16 18:17:31 In TestSimpleIndex_FieldValueCaseSensitivity()
2023/02/16 18:17:31 Index found:  index_company
2023/02/16 18:17:31 Using n1ql client
2023/02/16 18:17:31 Expected and Actual scan responses are the same
2023/02/16 18:17:31 Using n1ql client
2023/02/16 18:17:31 Expected and Actual scan responses are the same
--- PASS: TestSimpleIndex_FieldValueCaseSensitivity (0.07s)
=== RUN   TestSimpleIndex_BoolDataType
2023/02/16 18:17:31 In TestSimpleIndex_BoolDataType()
2023/02/16 18:17:38 Created the secondary index index_isActive. Waiting for it to become active
2023/02/16 18:17:38 Index is 17186397220299497706 now active
2023/02/16 18:17:38 Using n1ql client
2023/02/16 18:17:38 Expected and Actual scan responses are the same
--- PASS: TestSimpleIndex_BoolDataType (6.64s)
=== RUN   TestBasicLookup
2023/02/16 18:17:38 In TestBasicLookup()
2023/02/16 18:17:38 Index found:  index_company
2023/02/16 18:17:38 Using n1ql client
2023/02/16 18:17:38 Expected and Actual scan responses are the same
--- PASS: TestBasicLookup (0.01s)
=== RUN   TestIndexOnNonExistentField
2023/02/16 18:17:38 In TestIndexOnNonExistentField()
2023/02/16 18:17:44 Created the secondary index index_height. Waiting for it to become active
2023/02/16 18:17:44 Index is 4720555745349058893 now active
2023/02/16 18:17:44 Using n1ql client
2023/02/16 18:17:44 Expected and Actual scan responses are the same
--- PASS: TestIndexOnNonExistentField (6.45s)
=== RUN   TestIndexPartiallyMissingField
2023/02/16 18:17:44 In TestIndexPartiallyMissingField()
2023/02/16 18:17:51 Created the secondary index index_nationality. Waiting for it to become active
2023/02/16 18:17:51 Index is 9045456005941863717 now active
2023/02/16 18:17:51 Using n1ql client
2023/02/16 18:17:51 Expected and Actual scan responses are the same
--- PASS: TestIndexPartiallyMissingField (6.49s)
=== RUN   TestScanNonMatchingDatatype
2023/02/16 18:17:51 In TestScanNonMatchingDatatype()
2023/02/16 18:17:51 Index found:  index_age
2023/02/16 18:17:51 Using n1ql client
2023/02/16 18:17:51 Expected and Actual scan responses are the same
--- PASS: TestScanNonMatchingDatatype (0.02s)
=== RUN   TestInclusionNeither
2023/02/16 18:17:51 In TestInclusionNeither()
2023/02/16 18:17:51 Index found:  index_age
2023/02/16 18:17:51 Using n1ql client
2023/02/16 18:17:51 Expected and Actual scan responses are the same
--- PASS: TestInclusionNeither (0.04s)
=== RUN   TestInclusionLow
2023/02/16 18:17:51 In TestInclusionLow()
2023/02/16 18:17:51 Index found:  index_age
2023/02/16 18:17:51 Using n1ql client
2023/02/16 18:17:51 Expected and Actual scan responses are the same
--- PASS: TestInclusionLow (0.02s)
=== RUN   TestInclusionHigh
2023/02/16 18:17:51 In TestInclusionHigh()
2023/02/16 18:17:51 Index found:  index_age
2023/02/16 18:17:51 Using n1ql client
2023/02/16 18:17:51 Expected and Actual scan responses are the same
--- PASS: TestInclusionHigh (0.02s)
=== RUN   TestInclusionBoth
2023/02/16 18:17:51 In TestInclusionBoth()
2023/02/16 18:17:51 Index found:  index_age
2023/02/16 18:17:51 Using n1ql client
2023/02/16 18:17:51 Expected and Actual scan responses are the same
--- PASS: TestInclusionBoth (0.02s)
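The four `TestInclusion*` cases above scan the same range with different bound-inclusion flags: Neither excludes both bounds, Low includes only the low bound, High only the high bound, Both includes both. A self-contained sketch of that predicate (the `Inclusion` constants mirror the GSI flag values as I understand them; `inRange` is a hypothetical helper, not the real scan code):

```go
package main

import "fmt"

// Inclusion mirrors the GSI range-inclusion flags.
type Inclusion int

const (
	Neither Inclusion = 0 // (low, high)
	Low     Inclusion = 1 // [low, high)
	High    Inclusion = 2 // (low, high]
	Both    Inclusion = 3 // [low, high]
)

// inRange reports whether key falls in the range under the given inclusion.
func inRange(key, low, high int, incl Inclusion) bool {
	okLow := key > low
	if incl == Low || incl == Both {
		okLow = key >= low
	}
	okHigh := key < high
	if incl == High || incl == Both {
		okHigh = key <= high
	}
	return okLow && okHigh
}

func main() {
	// Ages 20..40 scanned with low=25, high=30 under each inclusion mode.
	for _, incl := range []Inclusion{Neither, Low, High, Both} {
		n := 0
		for age := 20; age <= 40; age++ {
			if inRange(age, 25, 30, incl) {
				n++
			}
		}
		fmt.Printf("inclusion=%d matches=%d\n", incl, n) // 4, 5, 5, 6 in order
	}
}
```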
=== RUN   TestNestedIndex_String
2023/02/16 18:17:51 In TestNestedIndex_String()
2023/02/16 18:17:57 Created the secondary index index_streetname. Waiting for it to become active
2023/02/16 18:17:57 Index is 11893071861118913758 now active
2023/02/16 18:17:57 Using n1ql client
2023/02/16 18:17:57 Expected and Actual scan responses are the same
--- PASS: TestNestedIndex_String (6.42s)
=== RUN   TestNestedIndex_Float
2023/02/16 18:17:57 In TestNestedIndex_Float()
2023/02/16 18:18:04 Created the secondary index index_floor. Waiting for it to become active
2023/02/16 18:18:04 Index is 149881028614819382 now active
2023/02/16 18:18:04 Using n1ql client
2023/02/16 18:18:04 Expected and Actual scan responses are the same
--- PASS: TestNestedIndex_Float (6.59s)
=== RUN   TestNestedIndex_Bool
2023/02/16 18:18:04 In TestNestedIndex_Bool()
2023/02/16 18:18:10 Created the secondary index index_isresidential. Waiting for it to become active
2023/02/16 18:18:10 Index is 17716603712861224217 now active
2023/02/16 18:18:10 Using n1ql client
2023/02/16 18:18:10 Expected and Actual scan responses are the same
--- PASS: TestNestedIndex_Bool (6.47s)
=== RUN   TestLookupJsonObject
2023/02/16 18:18:10 In TestLookupJsonObject()
2023/02/16 18:18:17 Created the secondary index index_streetaddress. Waiting for it to become active
2023/02/16 18:18:17 Index is 16152323106017866294 now active
2023/02/16 18:18:17 Using n1ql client
2023/02/16 18:18:17 Count of docScanResults is 1
2023/02/16 18:18:17 Key: User3bf51f08-0bac-4c03-bcec-5c255cbdde2c  Value: [map[buildingname:Sterling Heights doornumber:12B floor:5 streetname:Hill Street]]
2023/02/16 18:18:17 Count of scanResults is 1
2023/02/16 18:18:17 Key: string User3bf51f08-0bac-4c03-bcec-5c255cbdde2c  Value: value.Values [{"buildingname":"Sterling Heights","doornumber":"12B","floor":5,"streetname":"Hill Street"}] false
2023/02/16 18:18:17 Expected and Actual scan responses are the same
--- PASS: TestLookupJsonObject (6.59s)
=== RUN   TestLookupObjDifferentOrdering
2023/02/16 18:18:17 In TestLookupObjDifferentOrdering()
2023/02/16 18:18:17 Index found:  index_streetaddress
2023/02/16 18:18:17 Using n1ql client
2023/02/16 18:18:17 Count of docScanResults is 1
2023/02/16 18:18:17 Key: User3bf51f08-0bac-4c03-bcec-5c255cbdde2c  Value: [map[buildingname:Sterling Heights doornumber:12B floor:5 streetname:Hill Street]]
2023/02/16 18:18:17 Count of scanResults is 1
2023/02/16 18:18:17 Key: string User3bf51f08-0bac-4c03-bcec-5c255cbdde2c  Value: value.Values [{"buildingname":"Sterling Heights","doornumber":"12B","floor":5,"streetname":"Hill Street"}] false
2023/02/16 18:18:17 Expected and Actual scan responses are the same
--- PASS: TestLookupObjDifferentOrdering (0.02s)
=== RUN   TestRangeJsonObject
2023/02/16 18:18:17 In TestRangeJsonObject()
2023/02/16 18:18:17 Index found:  index_streetaddress
2023/02/16 18:18:17 Using n1ql client
2023/02/16 18:18:17 Count of scanResults is 2
2023/02/16 18:18:17 Key: string Userbb48952f-f8d1-4e04-a0e1-96b9019706fb  Value: value.Values [{"buildingname":"Rosewood Gardens","doornumber":"514","floor":2,"streetname":"Karweg Place"}] false
2023/02/16 18:18:17 Key: string User3bf51f08-0bac-4c03-bcec-5c255cbdde2c  Value: value.Values [{"buildingname":"Sterling Heights","doornumber":"12B","floor":5,"streetname":"Hill Street"}] false
2023/02/16 18:18:17 Count of docScanResults is 2
2023/02/16 18:18:17 Key: User3bf51f08-0bac-4c03-bcec-5c255cbdde2c  Value: [map[buildingname:Sterling Heights doornumber:12B floor:5 streetname:Hill Street]]
2023/02/16 18:18:17 Key: Userbb48952f-f8d1-4e04-a0e1-96b9019706fb  Value: [map[buildingname:Rosewood Gardens doornumber:514 floor:2 streetname:Karweg Place]]
2023/02/16 18:18:17 Expected and Actual scan responses are the same
--- PASS: TestRangeJsonObject (0.00s)
=== RUN   TestLookupFloatDiffForms
2023/02/16 18:18:17 In TestLookupFloatDiffForms()
2023/02/16 18:18:23 Created the secondary index index_latitude. Waiting for it to become active
2023/02/16 18:18:23 Index is 16970718883748822093 now active
2023/02/16 18:18:23 Scan 1
2023/02/16 18:18:23 Using n1ql client
2023/02/16 18:18:23 Expected and Actual scan responses are the same
2023/02/16 18:18:23 Scan 2
2023/02/16 18:18:23 Using n1ql client
2023/02/16 18:18:23 Expected and Actual scan responses are the same
2023/02/16 18:18:23 Scan 3
2023/02/16 18:18:23 Using n1ql client
2023/02/16 18:18:23 Expected and Actual scan responses are the same
2023/02/16 18:18:23 Scan 4
2023/02/16 18:18:23 Using n1ql client
2023/02/16 18:18:23 Expected and Actual scan responses are the same
2023/02/16 18:18:23 Scan 5
2023/02/16 18:18:23 Using n1ql client
2023/02/16 18:18:23 Expected and Actual scan responses are the same
2023/02/16 18:18:23 Scan 6
2023/02/16 18:18:23 Using n1ql client
2023/02/16 18:18:23 Expected and Actual scan responses are the same
--- PASS: TestLookupFloatDiffForms (6.65s)
=== RUN   TestRangeFloatInclVariations
2023/02/16 18:18:23 In TestRangeFloatInclVariations()
2023/02/16 18:18:23 Index found:  index_latitude
2023/02/16 18:18:23 Scan 1
2023/02/16 18:18:23 Using n1ql client
2023/02/16 18:18:23 Expected and Actual scan responses are the same
2023/02/16 18:18:23 Scan 2
2023/02/16 18:18:23 Using n1ql client
2023/02/16 18:18:23 Expected and Actual scan responses are the same
2023/02/16 18:18:23 Scan 3
2023/02/16 18:18:23 Using n1ql client
2023/02/16 18:18:23 Expected and Actual scan responses are the same
2023/02/16 18:18:23 Scan 4
2023/02/16 18:18:23 Using n1ql client
2023/02/16 18:18:23 Expected and Actual scan responses are the same
2023/02/16 18:18:23 Scan 5
2023/02/16 18:18:23 Using n1ql client
2023/02/16 18:18:23 Expected and Actual scan responses are the same
2023/02/16 18:18:23 Scan 6
2023/02/16 18:18:23 Using n1ql client
2023/02/16 18:18:23 Expected and Actual scan responses are the same
--- PASS: TestRangeFloatInclVariations (0.03s)
=== RUN   TestScanAll
2023/02/16 18:18:23 In TestScanAll()
2023/02/16 18:18:30 Created the secondary index index_name. Waiting for it to become active
2023/02/16 18:18:30 Index is 9348257386221516346 now active
2023/02/16 18:18:30 Length of docScanResults = 10502
2023/02/16 18:18:30 Using n1ql client
2023/02/16 18:18:30 Length of scanResults = 10502
2023/02/16 18:18:30 Expected and Actual scan responses are the same
--- PASS: TestScanAll (6.58s)
=== RUN   TestScanAllNestedField
2023/02/16 18:18:30 In TestScanAllNestedField()
2023/02/16 18:18:30 Index found:  index_streetname
2023/02/16 18:18:30 Length of docScanResults = 2
2023/02/16 18:18:30 Using n1ql client
2023/02/16 18:18:30 Length of scanResults = 2
2023/02/16 18:18:30 Expected and Actual scan responses are the same
--- PASS: TestScanAllNestedField (0.01s)
=== RUN   TestBasicPrimaryIndex
2023/02/16 18:18:30 In TestBasicPrimaryIndex()
2023/02/16 18:18:36 Created the secondary index index_p1. Waiting for it to become active
2023/02/16 18:18:36 Index is 11419975220829514252 now active
2023-02-16T18:18:36.733+05:30 [Error] transport error between 127.0.0.1:55338->127.0.0.1:9107: write tcp 127.0.0.1:55338->127.0.0.1:9107: write: broken pipe
2023-02-16T18:18:36.733+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"]  request transport failed `write tcp 127.0.0.1:55338->127.0.0.1:9107: write: broken pipe`
2023-02-16T18:18:36.733+05:30 [Error] metadataClient:PickRandom: Fail to find indexer for all index partitions. Num partition 1.  Partition with instances 0 
2023-02-16T18:18:36.733+05:30 [Error] metadataClient:PickRandom: Replicas - [13228084643903842194], PrunedReplica - map[], FilteredReplica map[]
2023/02/16 18:18:36 Expected and Actual scan responses are the same
2023/02/16 18:18:36 CountRange() expected and actual are:  1876 and 1876
2023/02/16 18:18:36 lookupkey for CountLookup() = User04c8ebda-9069-4631-875c-1f02bbc18c5b
2023/02/16 18:18:36 CountLookup() = 1
--- PASS: TestBasicPrimaryIndex (6.30s)
=== RUN   TestBasicNullDataType
2023/02/16 18:18:36 In TestBasicNullDataType()
2023/02/16 18:18:36 Index found:  index_email
2023/02/16 18:18:36 Using n1ql client
2023/02/16 18:18:36 Expected and Actual scan responses are the same
--- PASS: TestBasicNullDataType (0.01s)
=== RUN   TestBasicArrayDataType_ScanAll
2023/02/16 18:18:36 In TestBasicArrayDataType_ScanAll()
2023/02/16 18:18:43 Created the secondary index index_tags. Waiting for it to become active
2023/02/16 18:18:43 Index is 11341474633867297444 now active
2023/02/16 18:18:43 Using n1ql client
2023/02/16 18:18:43 Expected and Actual scan responses are the same
--- PASS: TestBasicArrayDataType_ScanAll (6.79s)
=== RUN   TestBasicArrayDataType_Lookup
2023/02/16 18:18:43 In TestBasicArrayDataType_Lookup()
2023/02/16 18:18:45 Index found:  index_tags
2023/02/16 18:18:45 Count of scanResults is 1
2023/02/16 18:18:45 Key: string Usere46cea01-38f6-4e7b-92e5-69d64668ae75  Value: value.Values [["reprehenderit","tempor","officia","exercitation","labore","sunt","tempor"]] false
--- PASS: TestBasicArrayDataType_Lookup (2.00s)
=== RUN   TestArrayDataType_LookupMissingArrayValue
2023/02/16 18:18:45 In TestArrayDataType_LookupMissingArrayValue()
2023/02/16 18:18:45 Index found:  index_tags
2023/02/16 18:18:45 Count of scanResults is 0
--- PASS: TestArrayDataType_LookupMissingArrayValue (0.00s)
=== RUN   TestArrayDataType_LookupWrongOrder
2023/02/16 18:18:45 In TestArrayDataType_LookupWrongOrder()
2023/02/16 18:18:45 Index found:  index_tags
2023/02/16 18:18:45 Count of scanResults is 0
--- PASS: TestArrayDataType_LookupWrongOrder (0.00s)
=== RUN   TestArrayDataType_LookupSubset
2023/02/16 18:18:45 In TestArrayDataType_LookupSubset()
2023/02/16 18:18:45 Index found:  index_tags
2023/02/16 18:18:45 Count of scanResults is 0
--- PASS: TestArrayDataType_LookupSubset (0.00s)
=== RUN   TestScanLimitParameter
2023/02/16 18:18:45 In TestScanLimitParameter()
2023/02/16 18:18:45 Index found:  index_age
2023/02/16 18:18:45 Using n1ql client
2023/02/16 18:18:45 Using n1ql client
--- PASS: TestScanLimitParameter (0.01s)
=== RUN   TestCountRange
2023/02/16 18:18:45 In TestCountRange()
2023/02/16 18:18:45 Index found:  index_age
2023/02/16 18:18:45 Count of expected and actual Range are:  2397 and 2397
2023/02/16 18:18:45 Count of expected and actual Range are: 10002 and 10002
2023/02/16 18:18:45 Count of expected and actual Range are: 0 and 0
2023/02/16 18:18:45 Count of expected and actual Range are: 496 and 496
2023/02/16 18:18:45 Testing CountRange() for key <= val
2023/02/16 18:18:45 Count of expected and actual CountRange for key <= 30 are: 5227 and 5227
2023/02/16 18:18:45 Testing CountRange() for key >= val
2023/02/16 18:18:45 Count of expected and actual CountRange for key >= 25 are: 7676 and 7676
2023/02/16 18:18:45 Testing CountRange() for null < key <= val
2023/02/16 18:18:45 Count of expected and actual CountRange for key > null && key <= 30 are: 5227 and 5227
2023/02/16 18:18:45 Testing CountRange() for val <= key < null 
2023/02/16 18:18:45 Count of expected and actual CountRange for key >= 25 && key < null are: 0 and 0
2023/02/16 18:18:45 Count of expected and actual Range are: 0 and 0
--- PASS: TestCountRange (0.08s)
=== RUN   TestCountLookup
2023/02/16 18:18:45 In TestCountLookup()
2023/02/16 18:18:45 Index found:  index_age
2023/02/16 18:18:45 Count of expected and actual Range are: 505 and 505
2023/02/16 18:18:45 Count of expected and actual Range are: 0 and 0
--- PASS: TestCountLookup (0.01s)
=== RUN   TestRangeStatistics
2023/02/16 18:18:45 In TestRangeStatistics()
2023/02/16 18:18:45 Index found:  index_age
--- PASS: TestRangeStatistics (0.00s)
=== RUN   TestIndexCreateWithWhere
2023/02/16 18:18:45 In TestIndexCreateWithWhere()
2023/02/16 18:18:50 Created the secondary index index_ageabove30. Waiting for it to become active
2023/02/16 18:18:50 Index is 13660731961231859991 now active
2023/02/16 18:18:50 Using n1ql client
2023/02/16 18:18:50 Expected and Actual scan responses are the same
2023/02/16 18:18:50 Lengths of expected and actual scanResults are:  4279 and 4279
2023/02/16 18:18:56 Created the secondary index index_ageteens. Waiting for it to become active
2023/02/16 18:18:56 Index is 9731984766587501038 now active
2023/02/16 18:18:56 Using n1ql client
2023/02/16 18:18:56 Expected and Actual scan responses are the same
2023/02/16 18:18:56 Lengths of expected and actual scanResults are:  0 and 0
2023/02/16 18:19:03 Created the secondary index index_age35to45. Waiting for it to become active
2023/02/16 18:19:03 Index is 11115705932659129513 now active
2023/02/16 18:19:03 Using n1ql client
2023/02/16 18:19:03 Expected and Actual scan responses are the same
2023/02/16 18:19:03 Lengths of expected and actual scanResults are:  2893 and 2893
--- PASS: TestIndexCreateWithWhere (17.63s)
=== RUN   TestDeferredIndexCreate
2023/02/16 18:19:03 In TestDeferredIndexCreate()
2023/02/16 18:19:03 Created the index index_deferred in deferred mode. Index state is INDEX_STATE_READY
2023/02/16 18:19:05 Build the deferred index index_deferred. Waiting for the index to become active
2023/02/16 18:19:05 Waiting for index 1562812630825535475 to go active ...
2023/02/16 18:19:06 Waiting for index 1562812630825535475 to go active ...
2023/02/16 18:19:07 Waiting for index 1562812630825535475 to go active ...
2023/02/16 18:19:08 Waiting for index 1562812630825535475 to go active ...
2023/02/16 18:19:09 Waiting for index 1562812630825535475 to go active ...
2023/02/16 18:19:10 Index is 1562812630825535475 now active
2023/02/16 18:19:10 Using n1ql client
2023/02/16 18:19:10 Expected and Actual scan responses are the same
--- PASS: TestDeferredIndexCreate (7.17s)
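The deferred-create test above builds the index with a deferred state (`INDEX_STATE_READY`), triggers the build, and then polls once a second until the index reports active. A minimal polling sketch under those assumptions (`waitActive` and its state getter are hypothetical stand-ins for a stats/metadata lookup, not the real test harness API):

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

// waitActive polls an index-state getter until it reports "ACTIVE",
// mirroring the "Waiting for index ... to go active" loop in the log.
func waitActive(getState func() string, interval time.Duration, maxPolls int) error {
	for i := 0; i < maxPolls; i++ {
		if getState() == "ACTIVE" {
			return nil
		}
		time.Sleep(interval)
	}
	return errors.New("index did not become active in time")
}

func main() {
	polls := 0
	err := waitActive(func() string {
		polls++
		if polls < 3 {
			return "INDEX_STATE_INITIAL" // still building
		}
		return "ACTIVE"
	}, time.Millisecond, 10)
	fmt.Println(err, polls) // <nil> 3
}
```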
=== RUN   TestCompositeIndex_NumAndString
2023/02/16 18:19:10 In TestCompositeIndex()
2023/02/16 18:19:17 Created the secondary index index_composite1. Waiting for it to become active
2023/02/16 18:19:17 Index is 16907485462709811691 now active
2023/02/16 18:19:17 Using n1ql client
2023/02/16 18:19:17 Using n1ql client
2023/02/16 18:19:17 Using n1ql client
2023/02/16 18:19:17 Expected and Actual scan responses are the same
--- PASS: TestCompositeIndex_NumAndString (6.90s)
=== RUN   TestCompositeIndex_TwoNumberFields
2023/02/16 18:19:17 In TestCompositeIndex()
2023/02/16 18:19:23 Created the secondary index index_composite2. Waiting for it to become active
2023/02/16 18:19:23 Index is 14147682957936109650 now active
2023/02/16 18:19:23 Using n1ql client
--- PASS: TestCompositeIndex_TwoNumberFields (6.66s)
=== RUN   TestNumbers_Int64_Float64
2023/02/16 18:19:24 In TestNumbers_Int64_Float64()
2023/02/16 18:19:30 Created the secondary index idx_numbertest. Waiting for it to become active
2023/02/16 18:19:30 Index is 12750943110430578747 now active
2023/02/16 18:19:30 ==== Int64 test #0
2023/02/16 18:19:30 Using n1ql client
2023/02/16 18:19:30 Expected and Actual scan responses are the same
2023/02/16 18:19:30 ==== Int64 test #1
2023/02/16 18:19:30 Using n1ql client
2023/02/16 18:19:30 Expected and Actual scan responses are the same
2023/02/16 18:19:30 ==== Int64 test #2
2023/02/16 18:19:30 Using n1ql client
2023/02/16 18:19:30 Expected and Actual scan responses are the same
2023/02/16 18:19:30 ==== Int64 test #3
2023/02/16 18:19:30 Using n1ql client
2023/02/16 18:19:30 Expected and Actual scan responses are the same
2023/02/16 18:19:30 ==== Int64 test #4
2023/02/16 18:19:30 Using n1ql client
2023/02/16 18:19:30 Expected and Actual scan responses are the same
2023/02/16 18:19:30 ==== Int64 test #5
2023/02/16 18:19:30 Using n1ql client
2023/02/16 18:19:30 Expected and Actual scan responses are the same
2023/02/16 18:19:30 ==== Int64 test #6
2023/02/16 18:19:31 Using n1ql client
2023/02/16 18:19:31 Expected and Actual scan responses are the same
2023/02/16 18:19:31 ==== Int64 test #7
2023/02/16 18:19:31 Using n1ql client
2023/02/16 18:19:31 Expected and Actual scan responses are the same
2023/02/16 18:19:31 ==== Int64 test #8
2023/02/16 18:19:31 Using n1ql client
2023/02/16 18:19:31 Expected and Actual scan responses are the same
2023/02/16 18:19:31 ==== Float64 test #0
2023/02/16 18:19:31 Using n1ql client
2023/02/16 18:19:31 Expected and Actual scan responses are the same
2023/02/16 18:19:31 ==== Float64 test #1
2023/02/16 18:19:31 Using n1ql client
2023/02/16 18:19:31 Expected and Actual scan responses are the same
2023/02/16 18:19:31 ==== Float64 test #2
2023/02/16 18:19:31 Using n1ql client
2023/02/16 18:19:31 Expected and Actual scan responses are the same
2023/02/16 18:19:31 ==== Float64 test #3
2023/02/16 18:19:31 Using n1ql client
2023/02/16 18:19:31 Expected and Actual scan responses are the same
--- PASS: TestNumbers_Int64_Float64 (7.31s)
=== RUN   TestRestartIndexer
2023/02/16 18:19:31 In TestRestartIndexer()
2023/02/16 18:19:31 []
2023-02-16T18:19:31.520+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023-02-16T18:19:31.520+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023/02/16 18:19:51 Using n1ql client
2023-02-16T18:19:51.460+05:30 [Error] transport error between 127.0.0.1:57804->127.0.0.1:9107: write tcp 127.0.0.1:57804->127.0.0.1:9107: write: broken pipe
2023-02-16T18:19:51.460+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] 2905042727476577114 request transport failed `write tcp 127.0.0.1:57804->127.0.0.1:9107: write: broken pipe`
2023/02/16 18:19:51 Len of expected and actual scan results are :  10002 and 10002
2023/02/16 18:19:51 Expected and Actual scan responses are the same
--- PASS: TestRestartIndexer (20.15s)
=== RUN   TestCreateDocsMutation
2023/02/16 18:19:51 In TestCreateDocsMutation()
2023/02/16 18:19:51 Index found:  index_age
2023/02/16 18:19:51 Using n1ql client
2023/02/16 18:19:51 Len of expected and actual scan results are :  10002 and 10002
2023/02/16 18:19:51 Expected and Actual scan responses are the same
2023/02/16 18:19:51 Using n1ql client
2023/02/16 18:19:52 Index Scan after mutations took 166.55171ms
2023/02/16 18:19:52 Len of expected and actual scan results are :  10102 and 10102
2023/02/16 18:19:52 Expected and Actual scan responses are the same
--- PASS: TestCreateDocsMutation (0.45s)
=== RUN   TestRestartProjector
2023/02/16 18:19:52 In TestRestartProjector()
2023/02/16 18:19:52 []
2023/02/16 18:20:12 Using n1ql client
2023/02/16 18:20:12 Len of expected and actual scan results are :  10102 and 10102
2023/02/16 18:20:12 Expected and Actual scan responses are the same
--- PASS: TestRestartProjector (20.07s)
=== RUN   TestDeleteDocsMutation
2023/02/16 18:20:12 In TestDeleteDocsMutation()
2023/02/16 18:20:12 Index found:  index_age
2023/02/16 18:20:12 Using n1ql client
2023/02/16 18:20:12 Len of expected and actual scan results are :  10102 and 10102
2023/02/16 18:20:12 Expected and Actual scan responses are the same
2023/02/16 18:20:12 Using n1ql client
2023/02/16 18:20:12 Index Scan after mutations took 195.116606ms
2023/02/16 18:20:12 Len of expected and actual scan results are :  9902 and 9902
2023/02/16 18:20:12 Expected and Actual scan responses are the same
--- PASS: TestDeleteDocsMutation (0.64s)
=== RUN   TestUpdateDocsMutation
2023/02/16 18:20:12 In TestUpdateDocsMutation()
2023/02/16 18:20:12 Index found:  index_age
2023/02/16 18:20:12 Using n1ql client
2023/02/16 18:20:12 Len of expected and actual scan results are :  9436 and 9436
2023/02/16 18:20:12 Expected and Actual scan responses are the same
2023/02/16 18:20:12 Num of keysFromMutDocs: 100
2023/02/16 18:20:12 Updating number of documents: 99
2023/02/16 18:20:13 Using n1ql client
2023/02/16 18:20:13 Index Scan after mutations took 195.098305ms
2023/02/16 18:20:13 Len of expected and actual scan results are :  9436 and 9436
2023/02/16 18:20:13 Expected and Actual scan responses are the same
--- PASS: TestUpdateDocsMutation (0.53s)
=== RUN   TestLargeMutations
2023/02/16 18:20:13 In TestLargeMutations()
2023/02/16 18:20:13 In DropAllSecondaryIndexes()
2023/02/16 18:20:13 Index found:  index_streetname
2023/02/16 18:20:13 Dropped index index_streetname
2023/02/16 18:20:13 Index found:  index_company
2023/02/16 18:20:13 Dropped index index_company
2023/02/16 18:20:13 Index found:  index_state
2023/02/16 18:20:13 Dropped index index_state
2023/02/16 18:20:13 Index found:  index_p1
2023/02/16 18:20:13 Dropped index index_p1
2023/02/16 18:20:13 Index found:  index_balance
2023/02/16 18:20:13 Dropped index index_balance
2023/02/16 18:20:13 Index found:  index_composite1
2023/02/16 18:20:13 Dropped index index_composite1
2023/02/16 18:20:13 Index found:  index_gender
2023/02/16 18:20:14 Dropped index index_gender
2023/02/16 18:20:14 Index found:  index_streetaddress
2023/02/16 18:20:14 Dropped index index_streetaddress
2023/02/16 18:20:14 Index found:  index_tags
2023/02/16 18:20:14 Dropped index index_tags
2023/02/16 18:20:14 Index found:  index_email
2023/02/16 18:20:14 Dropped index index_email
2023/02/16 18:20:14 Index found:  index_di1
2023/02/16 18:20:14 Dropped index index_di1
2023/02/16 18:20:14 Index found:  index_nationality
2023/02/16 18:20:14 Dropped index index_nationality
2023/02/16 18:20:14 Index found:  index_isresidential
2023/02/16 18:20:14 Dropped index index_isresidential
2023/02/16 18:20:14 Index found:  index_height
2023/02/16 18:20:14 Dropped index index_height
2023/02/16 18:20:14 Index found:  index_longitude
2023/02/16 18:20:14 Dropped index index_longitude
2023/02/16 18:20:14 Index found:  idx_age
2023/02/16 18:20:14 Dropped index idx_age
2023/02/16 18:20:14 Index found:  index_deferred
2023/02/16 18:20:14 Dropped index index_deferred
2023/02/16 18:20:14 Index found:  index_composite2
2023/02/16 18:20:15 Dropped index index_composite2
2023/02/16 18:20:15 Index found:  idx_numbertest
2023/02/16 18:20:15 Dropped index idx_numbertest
2023/02/16 18:20:15 Index found:  index_latitude
2023/02/16 18:20:15 Dropped index index_latitude
2023/02/16 18:20:15 Index found:  index_name
2023/02/16 18:20:15 Dropped index index_name
2023/02/16 18:20:15 Index found:  index_ageabove30
2023/02/16 18:20:15 Dropped index index_ageabove30
2023/02/16 18:20:15 Index found:  index_eyeColor
2023/02/16 18:20:15 Dropped index index_eyeColor
2023/02/16 18:20:15 Index found:  index_age
2023/02/16 18:20:15 Dropped index index_age
2023/02/16 18:20:15 Index found:  index_ageteens
2023/02/16 18:20:15 Dropped index index_ageteens
2023/02/16 18:20:15 Index found:  index_isActive
2023/02/16 18:20:15 Dropped index index_isActive
2023/02/16 18:20:15 Index found:  index_i2
2023/02/16 18:20:15 Dropped index index_i2
2023/02/16 18:20:15 Index found:  index_age35to45
2023/02/16 18:20:15 Dropped index index_age35to45
2023/02/16 18:20:15 Index found:  index_cdc
2023/02/16 18:20:15 Dropped index index_cdc
2023/02/16 18:20:15 Index found:  index_floor
2023/02/16 18:20:16 Dropped index index_floor
2023/02/16 18:20:16 Index found:  index_pin
2023/02/16 18:20:16 Dropped index index_pin
2023/02/16 18:20:37 Created the secondary index indexmut_1. Waiting for it to become active
2023/02/16 18:20:37 Index is 3175816723488858201 now active
2023/02/16 18:20:37 Using n1ql client
2023/02/16 18:20:37 Expected and Actual scan responses are the same
2023/02/16 18:20:37 Len of expected and actual scan results are :  29902 and 29902
2023/02/16 18:20:37 ITERATION 0
2023/02/16 18:20:56 Created the secondary index indexmut_2. Waiting for it to become active
2023/02/16 18:20:56 Index is 5682773917611740595 now active
2023/02/16 18:20:56 Using n1ql client
2023/02/16 18:20:57 Expected and Actual scan responses are the same
2023/02/16 18:20:57 Len of expected and actual scan results are :  39902 and 39902
2023/02/16 18:20:57 Using n1ql client
2023/02/16 18:20:57 Expected and Actual scan responses are the same
2023/02/16 18:20:57 Len of expected and actual scan results are :  39902 and 39902
2023/02/16 18:20:57 Dropping the secondary index indexmut_2
2023/02/16 18:20:57 Index dropped
2023/02/16 18:20:57 ITERATION 1
2023/02/16 18:21:15 Created the secondary index indexmut_2. Waiting for it to become active
2023/02/16 18:21:15 Index is 15509508388423394383 now active
2023/02/16 18:21:15 Using n1ql client
2023/02/16 18:21:16 Expected and Actual scan responses are the same
2023/02/16 18:21:16 Len of expected and actual scan results are :  49902 and 49902
2023/02/16 18:21:16 Using n1ql client
2023/02/16 18:21:16 Expected and Actual scan responses are the same
2023/02/16 18:21:16 Len of expected and actual scan results are :  49902 and 49902
2023/02/16 18:21:16 Dropping the secondary index indexmut_2
2023/02/16 18:21:16 Index dropped
2023/02/16 18:21:16 ITERATION 2
2023/02/16 18:21:35 Created the secondary index indexmut_2. Waiting for it to become active
2023/02/16 18:21:35 Index is 15009632959763374676 now active
2023/02/16 18:21:35 Using n1ql client
2023/02/16 18:21:36 Expected and Actual scan responses are the same
2023/02/16 18:21:36 Len of expected and actual scan results are :  59902 and 59902
2023/02/16 18:21:36 Using n1ql client
2023/02/16 18:21:36 Expected and Actual scan responses are the same
2023/02/16 18:21:36 Len of expected and actual scan results are :  59902 and 59902
2023/02/16 18:21:36 Dropping the secondary index indexmut_2
2023/02/16 18:21:36 Index dropped
2023/02/16 18:21:36 ITERATION 3
2023/02/16 18:21:58 Created the secondary index indexmut_2. Waiting for it to become active
2023/02/16 18:21:58 Index is 11198258941666354548 now active
2023/02/16 18:21:58 Using n1ql client
2023/02/16 18:21:59 Expected and Actual scan responses are the same
2023/02/16 18:21:59 Len of expected and actual scan results are :  69902 and 69902
2023/02/16 18:21:59 Using n1ql client
2023/02/16 18:21:59 Expected and Actual scan responses are the same
2023/02/16 18:21:59 Len of expected and actual scan results are :  69902 and 69902
2023/02/16 18:21:59 Dropping the secondary index indexmut_2
2023/02/16 18:21:59 Index dropped
2023/02/16 18:21:59 ITERATION 4
2023/02/16 18:22:21 Created the secondary index indexmut_2. Waiting for it to become active
2023/02/16 18:22:21 Index is 15264741248068164677 now active
2023/02/16 18:22:22 Using n1ql client
2023/02/16 18:22:22 Expected and Actual scan responses are the same
2023/02/16 18:22:22 Len of expected and actual scan results are :  79902 and 79902
2023/02/16 18:22:22 Using n1ql client
2023/02/16 18:22:23 Expected and Actual scan responses are the same
2023/02/16 18:22:23 Len of expected and actual scan results are :  79902 and 79902
2023/02/16 18:22:23 Dropping the secondary index indexmut_2
2023/02/16 18:22:23 Index dropped
2023/02/16 18:22:23 ITERATION 5
2023/02/16 18:22:44 Created the secondary index indexmut_2. Waiting for it to become active
2023/02/16 18:22:44 Index is 15618579251820280300 now active
2023/02/16 18:22:44 Using n1ql client
2023/02/16 18:22:45 Expected and Actual scan responses are the same
2023/02/16 18:22:45 Len of expected and actual scan results are :  89902 and 89902
2023/02/16 18:22:45 Using n1ql client
2023/02/16 18:22:46 Expected and Actual scan responses are the same
2023/02/16 18:22:46 Len of expected and actual scan results are :  89902 and 89902
2023/02/16 18:22:46 Dropping the secondary index indexmut_2
2023/02/16 18:22:46 Index dropped
2023/02/16 18:22:46 ITERATION 6
2023/02/16 18:23:07 Created the secondary index indexmut_2. Waiting for it to become active
2023/02/16 18:23:07 Index is 15398526283214037181 now active
2023/02/16 18:23:08 Using n1ql client
2023/02/16 18:23:08 Expected and Actual scan responses are the same
2023/02/16 18:23:08 Len of expected and actual scan results are :  99902 and 99902
2023/02/16 18:23:08 Using n1ql client
2023/02/16 18:23:09 Expected and Actual scan responses are the same
2023/02/16 18:23:09 Len of expected and actual scan results are :  99902 and 99902
2023/02/16 18:23:09 Dropping the secondary index indexmut_2
2023/02/16 18:23:09 Index dropped
2023/02/16 18:23:09 ITERATION 7
2023/02/16 18:23:33 Created the secondary index indexmut_2. Waiting for it to become active
2023/02/16 18:23:33 Index is 16266910872047016197 now active
2023/02/16 18:23:33 Using n1ql client
2023/02/16 18:23:34 Expected and Actual scan responses are the same
2023/02/16 18:23:34 Len of expected and actual scan results are :  109902 and 109902
2023/02/16 18:23:34 Using n1ql client
2023/02/16 18:23:36 Expected and Actual scan responses are the same
2023/02/16 18:23:36 Len of expected and actual scan results are :  109902 and 109902
2023/02/16 18:23:36 Dropping the secondary index indexmut_2
2023/02/16 18:23:36 Index dropped
2023/02/16 18:23:36 ITERATION 8
2023/02/16 18:23:59 Created the secondary index indexmut_2. Waiting for it to become active
2023/02/16 18:23:59 Index is 13506727452194939979 now active
2023/02/16 18:23:59 Using n1ql client
2023/02/16 18:24:00 Expected and Actual scan responses are the same
2023/02/16 18:24:00 Len of expected and actual scan results are :  119902 and 119902
2023/02/16 18:24:00 Using n1ql client
2023/02/16 18:24:01 Expected and Actual scan responses are the same
2023/02/16 18:24:01 Len of expected and actual scan results are :  119902 and 119902
2023/02/16 18:24:01 Dropping the secondary index indexmut_2
2023/02/16 18:24:01 Index dropped
2023/02/16 18:24:01 ITERATION 9
2023/02/16 18:24:26 Created the secondary index indexmut_2. Waiting for it to become active
2023/02/16 18:24:26 Index is 1077559650024410445 now active
2023/02/16 18:24:26 Using n1ql client
2023/02/16 18:24:27 Expected and Actual scan responses are the same
2023/02/16 18:24:27 Len of expected and actual scan results are :  129902 and 129902
2023/02/16 18:24:27 Using n1ql client
2023/02/16 18:24:28 Expected and Actual scan responses are the same
2023/02/16 18:24:28 Len of expected and actual scan results are :  129902 and 129902
2023/02/16 18:24:28 Dropping the secondary index indexmut_2
2023/02/16 18:24:28 Index dropped
2023/02/16 18:24:28 ITERATION 10
2023/02/16 18:24:55 Created the secondary index indexmut_2. Waiting for it to become active
2023/02/16 18:24:55 Index is 14917452291346106710 now active
2023/02/16 18:24:55 Using n1ql client
2023/02/16 18:24:56 Expected and Actual scan responses are the same
2023/02/16 18:24:56 Len of expected and actual scan results are :  139902 and 139902
2023/02/16 18:24:56 Using n1ql client
2023/02/16 18:24:57 Expected and Actual scan responses are the same
2023/02/16 18:24:57 Len of expected and actual scan results are :  139902 and 139902
2023/02/16 18:24:57 Dropping the secondary index indexmut_2
2023/02/16 18:24:57 Index dropped
--- PASS: TestLargeMutations (284.28s)
=== RUN   TestPlanner
2023/02/16 18:24:57 In TestPlanner()
2023/02/16 18:24:57 -------------------------------------------
2023/02/16 18:24:57 initial placement - 20-50M, 10 index, 3 replica, 2x
2023-02-16T18:24:57.545+05:30 [Info] Refreshing indexer list due to cluster changes or auto-refresh.
2023-02-16T18:24:57.545+05:30 [Info] Refreshed Indexer List: [127.0.0.1:9106]
2023-02-16T18:24:57.545+05:30 [Info] Planner::planSingleRun Initial variance of the solution: 0
2023-02-16T18:24:57.549+05:30 [Info] Refreshing indexer list due to cluster changes or auto-refresh.
2023-02-16T18:24:57.549+05:30 [Info] Refreshed Indexer List: [127.0.0.1:9106]
2023-02-16T18:24:57.551+05:30 [Info] switched currmeta from 482 -> 482 force true 
2023-02-16T18:24:57.554+05:30 [Info] switched currmeta from 477 -> 478 force true 
2023-02-16T18:24:57.560+05:30 [Info] Refreshing indexer list due to cluster changes or auto-refresh.
2023-02-16T18:24:57.560+05:30 [Info] Refreshed Indexer List: [127.0.0.1:9106]
2023-02-16T18:24:57.562+05:30 [Info] Refreshing indexer list due to cluster changes or auto-refresh.
2023-02-16T18:24:57.562+05:30 [Info] Refreshed Indexer List: [127.0.0.1:9106]
2023-02-16T18:24:57.564+05:30 [Info] switched currmeta from 482 -> 482 force true 
2023-02-16T18:24:57.570+05:30 [Info] switched currmeta from 478 -> 478 force true 
2023-02-16T18:24:57.650+05:30 [Info] Planner::finalizing the solution as there are no more valid index movements.
2023-02-16T18:24:57.650+05:30 [Info] Score: 0.03776642789162431
2023-02-16T18:24:57.650+05:30 [Info] Memory Quota: 52798096354 (49.1721G)
2023-02-16T18:24:57.650+05:30 [Info] CPU Quota: 12
2023-02-16T18:24:57.650+05:30 [Info] Indexer Memory Mean 33126232971 (30.8512G)
2023-02-16T18:24:57.650+05:30 [Info] Indexer Memory Deviation 2502118977 (2.33028G) (7.55%)
2023-02-16T18:24:57.650+05:30 [Info] Indexer Memory Utilization 0.6274
2023-02-16T18:24:57.650+05:30 [Info] Indexer CPU Mean 8.2996
2023-02-16T18:24:57.650+05:30 [Info] Indexer CPU Deviation 2.43 (29.26%)
2023-02-16T18:24:57.650+05:30 [Info] Indexer CPU Utilization 0.6916
2023-02-16T18:24:57.650+05:30 [Info] Indexer IO Mean 0.0000
2023-02-16T18:24:57.650+05:30 [Info] Indexer IO Deviation 0.00 (0.00%)
2023-02-16T18:24:57.650+05:30 [Info] Indexer Drain Rate Mean 0.0000
2023-02-16T18:24:57.650+05:30 [Info] Indexer Drain Rate Deviation 0.00 (0.00%)
2023-02-16T18:24:57.650+05:30 [Info] Indexer Scan Rate Mean 0.0000
2023-02-16T18:24:57.650+05:30 [Info] Indexer Scan Rate Deviation 0.00 (0.00%)
2023-02-16T18:24:57.650+05:30 [Info] Indexer Data Size Mean 33126232971 (30.8512G)
2023-02-16T18:24:57.650+05:30 [Info] Indexer Data Size Deviation 2502118977 (2.33028G) (7.55%)
2023-02-16T18:24:57.651+05:30 [Info] Total Index Data (from non-deleted node) 0
2023-02-16T18:24:57.651+05:30 [Info] Index Data Moved (exclude new node) 0 (0.00%)
2023-02-16T18:24:57.651+05:30 [Info] No. Index (from non-deleted node) 0
2023-02-16T18:24:57.651+05:30 [Info] No. Index Moved (exclude new node) 0 (0.00%)
2023/02/16 18:24:57 -------------------------------------------
2023/02/16 18:24:57 initial placement - 20-50M, 30 index, 3 replica, 2x
2023-02-16T18:24:57.651+05:30 [Info] Planner::planSingleRun Initial variance of the solution: 0
2023-02-16T18:24:58.651+05:30 [Info] Score: 0.02876849161018266
2023-02-16T18:24:58.651+05:30 [Info] Memory Quota: 60731228814 (56.5604G)
2023-02-16T18:24:58.651+05:30 [Info] CPU Quota: 12
2023-02-16T18:24:58.651+05:30 [Info] Indexer Memory Mean 37219982524 (34.6638G)
2023-02-16T18:24:58.651+05:30 [Info] Indexer Memory Deviation 2141525509 (1.99445G) (5.75%)
2023-02-16T18:24:58.651+05:30 [Info] Indexer Memory Utilization 0.6129
2023-02-16T18:24:58.651+05:30 [Info] Indexer CPU Mean 11.1655
2023-02-16T18:24:58.651+05:30 [Info] Indexer CPU Deviation 2.25 (20.14%)
2023-02-16T18:24:58.651+05:30 [Info] Indexer CPU Utilization 0.9305
2023-02-16T18:24:58.651+05:30 [Info] Indexer IO Mean 0.0000
2023-02-16T18:24:58.651+05:30 [Info] Indexer IO Deviation 0.00 (0.00%)
2023-02-16T18:24:58.651+05:30 [Info] Indexer Drain Rate Mean 0.0000
2023-02-16T18:24:58.651+05:30 [Info] Indexer Drain Rate Deviation 0.00 (0.00%)
2023-02-16T18:24:58.651+05:30 [Info] Indexer Scan Rate Mean 0.0000
2023-02-16T18:24:58.651+05:30 [Info] Indexer Scan Rate Deviation 0.00 (0.00%)
2023-02-16T18:24:58.651+05:30 [Info] Indexer Data Size Mean 37219982524 (34.6638G)
2023-02-16T18:24:58.651+05:30 [Info] Indexer Data Size Deviation 2141525509 (1.99445G) (5.75%)
2023-02-16T18:24:58.651+05:30 [Info] Total Index Data (from non-deleted node) 0
2023-02-16T18:24:58.651+05:30 [Info] Index Data Moved (exclude new node) 0 (0.00%)
2023-02-16T18:24:58.651+05:30 [Info] No. Index (from non-deleted node) 0
2023-02-16T18:24:58.651+05:30 [Info] No. Index Moved (exclude new node) 0 (0.00%)
2023/02/16 18:24:58 -------------------------------------------
2023/02/16 18:24:58 initial placement - 20-50M, 30 index, 3 replica, 4x
2023-02-16T18:24:58.651+05:30 [Info] Planner::planSingleRun Initial variance of the solution: 0
2023-02-16T18:24:58.892+05:30 [Info] Score: 0.01758221679808122
2023-02-16T18:24:58.892+05:30 [Info] Memory Quota: 134701268852 (125.45G)
2023-02-16T18:24:58.892+05:30 [Info] CPU Quota: 24
2023-02-16T18:24:58.892+05:30 [Info] Indexer Memory Mean 86854560900 (80.8896G)
2023-02-16T18:24:58.892+05:30 [Info] Indexer Memory Deviation 3054191439 (2.84444G) (3.52%)
2023-02-16T18:24:58.892+05:30 [Info] Indexer Memory Utilization 0.6448
2023-02-16T18:24:58.892+05:30 [Info] Indexer CPU Mean 22.6732
2023-02-16T18:24:58.892+05:30 [Info] Indexer CPU Deviation 3.32 (14.66%)
2023-02-16T18:24:58.892+05:30 [Info] Indexer CPU Utilization 0.9447
2023-02-16T18:24:58.892+05:30 [Info] Indexer IO Mean 0.0000
2023-02-16T18:24:58.892+05:30 [Info] Indexer IO Deviation 0.00 (0.00%)
2023-02-16T18:24:58.892+05:30 [Info] Indexer Drain Rate Mean 0.0000
2023-02-16T18:24:58.892+05:30 [Info] Indexer Drain Rate Deviation 0.00 (0.00%)
2023-02-16T18:24:58.892+05:30 [Info] Indexer Scan Rate Mean 0.0000
2023-02-16T18:24:58.892+05:30 [Info] Indexer Scan Rate Deviation 0.00 (0.00%)
2023-02-16T18:24:58.892+05:30 [Info] Indexer Data Size Mean 86854560900 (80.8896G)
2023-02-16T18:24:58.892+05:30 [Info] Indexer Data Size Deviation 3054191439 (2.84444G) (3.52%)
2023-02-16T18:24:58.892+05:30 [Info] Total Index Data (from non-deleted node) 0
2023-02-16T18:24:58.892+05:30 [Info] Index Data Moved (exclude new node) 0 (0.00%)
2023-02-16T18:24:58.892+05:30 [Info] No. Index (from non-deleted node) 0
2023-02-16T18:24:58.892+05:30 [Info] No. Index Moved (exclude new node) 0 (0.00%)
2023/02/16 18:24:58 -------------------------------------------
2023/02/16 18:24:58 initial placement - 200-500M, 10 index, 3 replica, 2x
2023-02-16T18:24:58.893+05:30 [Info] Planner::planSingleRun Initial variance of the solution: 0
2023-02-16T18:24:58.943+05:30 [Info] Score: 0.024031470600575477
2023-02-16T18:24:58.943+05:30 [Info] Memory Quota: 483530836238 (450.323G)
2023-02-16T18:24:58.943+05:30 [Info] CPU Quota: 10
2023-02-16T18:24:58.943+05:30 [Info] Indexer Memory Mean 370999720440 (345.52G)
2023-02-16T18:24:58.943+05:30 [Info] Indexer Memory Deviation 17831337749 (16.6067G) (4.81%)
2023-02-16T18:24:58.943+05:30 [Info] Indexer Memory Utilization 0.7673
2023-02-16T18:24:58.943+05:30 [Info] Indexer CPU Mean 9.9183
2023-02-16T18:24:58.943+05:30 [Info] Indexer CPU Deviation 1.98 (19.93%)
2023-02-16T18:24:58.943+05:30 [Info] Indexer CPU Utilization 0.9918
2023-02-16T18:24:58.943+05:30 [Info] Indexer IO Mean 0.0000
2023-02-16T18:24:58.943+05:30 [Info] Indexer IO Deviation 0.00 (0.00%)
2023-02-16T18:24:58.943+05:30 [Info] Indexer Drain Rate Mean 0.0000
2023-02-16T18:24:58.943+05:30 [Info] Indexer Drain Rate Deviation 0.00 (0.00%)
2023-02-16T18:24:58.943+05:30 [Info] Indexer Scan Rate Mean 0.0000
2023-02-16T18:24:58.943+05:30 [Info] Indexer Scan Rate Deviation 0.00 (0.00%)
2023-02-16T18:24:58.943+05:30 [Info] Indexer Data Size Mean 370999720440 (345.52G)
2023-02-16T18:24:58.943+05:30 [Info] Indexer Data Size Deviation 17831337749 (16.6067G) (4.81%)
2023-02-16T18:24:58.943+05:30 [Info] Total Index Data (from non-deleted node) 0
2023-02-16T18:24:58.943+05:30 [Info] Index Data Moved (exclude new node) 0 (0.00%)
2023-02-16T18:24:58.943+05:30 [Info] No. Index (from non-deleted node) 0
2023-02-16T18:24:58.943+05:30 [Info] No. Index Moved (exclude new node) 0 (0.00%)
2023/02/16 18:24:58 -------------------------------------------
2023/02/16 18:24:58 initial placement - 200-500M, 30 index, 3 replica, 2x
2023-02-16T18:24:58.944+05:30 [Info] Planner::planSingleRun Initial variance of the solution: 0
2023-02-16T18:24:59.995+05:30 [Info] Score: 0.024293284068718914
2023-02-16T18:24:59.995+05:30 [Info] Memory Quota: 555205341384 (517.075G)
2023-02-16T18:24:59.995+05:30 [Info] CPU Quota: 12
2023-02-16T18:24:59.995+05:30 [Info] Indexer Memory Mean 431059314292 (401.455G)
2023-02-16T18:24:59.995+05:30 [Info] Indexer Memory Deviation 20943692745 (19.5053G) (4.86%)
2023-02-16T18:24:59.995+05:30 [Info] Indexer Memory Utilization 0.7764
2023-02-16T18:24:59.995+05:30 [Info] Indexer CPU Mean 12.9636
2023-02-16T18:24:59.995+05:30 [Info] Indexer CPU Deviation 2.96 (22.84%)
2023-02-16T18:24:59.995+05:30 [Info] Indexer CPU Utilization 1.0803
2023-02-16T18:24:59.995+05:30 [Info] Indexer IO Mean 0.0000
2023-02-16T18:24:59.995+05:30 [Info] Indexer IO Deviation 0.00 (0.00%)
2023-02-16T18:24:59.995+05:30 [Info] Indexer Drain Rate Mean 0.0000
2023-02-16T18:24:59.995+05:30 [Info] Indexer Drain Rate Deviation 0.00 (0.00%)
2023-02-16T18:24:59.995+05:30 [Info] Indexer Scan Rate Mean 0.0000
2023-02-16T18:24:59.995+05:30 [Info] Indexer Scan Rate Deviation 0.00 (0.00%)
2023-02-16T18:24:59.995+05:30 [Info] Indexer Data Size Mean 431059314292 (401.455G)
2023-02-16T18:24:59.995+05:30 [Info] Indexer Data Size Deviation 20943692745 (19.5053G) (4.86%)
2023-02-16T18:24:59.995+05:30 [Info] Total Index Data (from non-deleted node) 0
2023-02-16T18:24:59.995+05:30 [Info] Index Data Moved (exclude new node) 0 (0.00%)
2023-02-16T18:24:59.995+05:30 [Info] No. Index (from non-deleted node) 0
2023-02-16T18:24:59.995+05:30 [Info] No. Index Moved (exclude new node) 0 (0.00%)
2023/02/16 18:24:59 -------------------------------------------
2023/02/16 18:24:59 initial placement - mixed small/medium, 30 index, 3 replica, 1.5/4x
2023-02-16T18:24:59.996+05:30 [Info] Planner::planSingleRun Initial variance of the solution: 0
2023-02-16T18:25:00.200+05:30 [Info] Score: 0.011777552666422733
2023-02-16T18:25:00.200+05:30 [Info] Memory Quota: 335332648317 (312.303G)
2023-02-16T18:25:00.200+05:30 [Info] CPU Quota: 16
2023-02-16T18:25:00.200+05:30 [Info] Indexer Memory Mean 262423545882 (244.401G)
2023-02-16T18:25:00.200+05:30 [Info] Indexer Memory Deviation 6181414265 (5.75689G) (2.36%)
2023-02-16T18:25:00.200+05:30 [Info] Indexer Memory Utilization 0.7826
2023-02-16T18:25:00.200+05:30 [Info] Indexer CPU Mean 12.9031
2023-02-16T18:25:00.200+05:30 [Info] Indexer CPU Deviation 7.89 (61.12%)
2023-02-16T18:25:00.200+05:30 [Info] Indexer CPU Utilization 0.8064
2023-02-16T18:25:00.200+05:30 [Info] Indexer IO Mean 0.0000
2023-02-16T18:25:00.200+05:30 [Info] Indexer IO Deviation 0.00 (0.00%)
2023-02-16T18:25:00.200+05:30 [Info] Indexer Drain Rate Mean 0.0000
2023-02-16T18:25:00.200+05:30 [Info] Indexer Drain Rate Deviation 0.00 (0.00%)
2023-02-16T18:25:00.200+05:30 [Info] Indexer Scan Rate Mean 0.0000
2023-02-16T18:25:00.200+05:30 [Info] Indexer Scan Rate Deviation 0.00 (0.00%)
2023-02-16T18:25:00.200+05:30 [Info] Indexer Data Size Mean 262423545882 (244.401G)
2023-02-16T18:25:00.200+05:30 [Info] Indexer Data Size Deviation 6181414265 (5.75689G) (2.36%)
2023-02-16T18:25:00.200+05:30 [Info] Total Index Data (from non-deleted node) 0
2023-02-16T18:25:00.200+05:30 [Info] Index Data Moved (exclude new node) 0 (0.00%)
2023-02-16T18:25:00.200+05:30 [Info] No. Index (from non-deleted node) 0
2023-02-16T18:25:00.200+05:30 [Info] No. Index Moved (exclude new node) 0 (0.00%)
2023/02/16 18:25:00 -------------------------------------------
2023/02/16 18:25:00 initial placement - mixed all, 30 index, 3 replica, 1.5/4x
2023-02-16T18:25:00.201+05:30 [Info] Planner::planSingleRun Initial variance of the solution: 0
2023-02-16T18:25:00.891+05:30 [Info] serviceChangeNotifier: received PoolChangeNotification
2023-02-16T18:25:00.895+05:30 [Info] Refreshing indexer list due to cluster changes or auto-refresh.
2023-02-16T18:25:00.896+05:30 [Info] Refreshed Indexer List: [127.0.0.1:9106]
2023-02-16T18:25:00.896+05:30 [Info] Refreshing indexer list due to cluster changes or auto-refresh.
2023-02-16T18:25:00.896+05:30 [Info] Refreshed Indexer List: [127.0.0.1:9106]
2023-02-16T18:25:00.897+05:30 [Info] switched currmeta from 482 -> 482 force true 
2023-02-16T18:25:00.898+05:30 [Info] switched currmeta from 478 -> 478 force true 
2023-02-16T18:25:00.966+05:30 [Info] Score: 0.030884303226105667
2023-02-16T18:25:00.966+05:30 [Info] Memory Quota: 327411603075 (304.926G)
2023-02-16T18:25:00.966+05:30 [Info] CPU Quota: 24
2023-02-16T18:25:00.966+05:30 [Info] Indexer Memory Mean 258791210638 (241.018G)
2023-02-16T18:25:00.966+05:30 [Info] Indexer Memory Deviation 15985172443 (14.8874G) (6.18%)
2023-02-16T18:25:00.966+05:30 [Info] Indexer Memory Utilization 0.7904
2023-02-16T18:25:00.966+05:30 [Info] Indexer CPU Mean 7.8960
2023-02-16T18:25:00.966+05:30 [Info] Indexer CPU Deviation 4.70 (59.50%)
2023-02-16T18:25:00.966+05:30 [Info] Indexer CPU Utilization 0.3290
2023-02-16T18:25:00.966+05:30 [Info] Indexer IO Mean 0.0000
2023-02-16T18:25:00.966+05:30 [Info] Indexer IO Deviation 0.00 (0.00%)
2023-02-16T18:25:00.966+05:30 [Info] Indexer Drain Rate Mean 0.0000
2023-02-16T18:25:00.966+05:30 [Info] Indexer Drain Rate Deviation 0.00 (0.00%)
2023-02-16T18:25:00.966+05:30 [Info] Indexer Scan Rate Mean 0.0000
2023-02-16T18:25:00.966+05:30 [Info] Indexer Scan Rate Deviation 0.00 (0.00%)
2023-02-16T18:25:00.966+05:30 [Info] Indexer Data Size Mean 258791210638 (241.018G)
2023-02-16T18:25:00.966+05:30 [Info] Indexer Data Size Deviation 15985172443 (14.8874G) (6.18%)
2023-02-16T18:25:00.966+05:30 [Info] Total Index Data (from non-deleted node) 0
2023-02-16T18:25:00.966+05:30 [Info] Index Data Moved (exclude new node) 0 (0.00%)
2023-02-16T18:25:00.966+05:30 [Info] No. Index (from non-deleted node) 0
2023-02-16T18:25:00.966+05:30 [Info] No. Index Moved (exclude new node) 0 (0.00%)
    set03_planner_test.go:423: validation fails: cpu usage of indexer does not match sum of index cpu use
--- FAIL: TestPlanner (3.42s)
=== RUN   TestGreedyPlanner
2023/02/16 18:25:00 In TestGreedyPlanner()
2023/02/16 18:25:00 -------------------------------------------
2023/02/16 18:25:00 Place Single Index Instance - 3 empty nodes - 1 SG
2023-02-16T18:25:00.970+05:30 [Info] Using greedy index placement for index 987654
2023/02/16 18:25:00 -------------------------------------------
2023/02/16 18:25:00 Place Single Index Instance - 2 empty nodes, 1 non-empty node - 1 SG
2023-02-16T18:25:00.972+05:30 [Info] Using greedy index placement for index 987654
2023/02/16 18:25:00 -------------------------------------------
2023/02/16 18:25:00 Place Single Index Instance - 1 empty node, 2 non-empty nodes - 1 SG
2023-02-16T18:25:00.974+05:30 [Info] Using greedy index placement for index 987654
2023/02/16 18:25:00 -------------------------------------------
2023/02/16 18:25:00 Place Single Index Instance - 3 non-empty nodes - 1 SG
2023-02-16T18:25:00.976+05:30 [Info] Using greedy index placement for index 987654
2023/02/16 18:25:00 -------------------------------------------
2023/02/16 18:25:00 Place Index With 1 Replica - 3 empty nodes - 1 SG
2023-02-16T18:25:00.978+05:30 [Info] Using greedy index placement for index 987654
2023/02/16 18:25:00 -------------------------------------------
2023/02/16 18:25:00 Place Index With 1 Replica - 2 empty nodes, 1 non-empty node - 1 SG
2023-02-16T18:25:00.980+05:30 [Info] Using greedy index placement for index 987654
2023/02/16 18:25:00 -------------------------------------------
2023/02/16 18:25:00 Place Index With 1 Replica - 1 empty node, 2 non-empty nodes - 1 SG
2023-02-16T18:25:00.982+05:30 [Info] Using greedy index placement for index 987654
2023/02/16 18:25:00 -------------------------------------------
2023/02/16 18:25:00 Place Index With 1 Replica - 3 non-empty nodes - 1 SG
2023-02-16T18:25:00.984+05:30 [Info] Using greedy index placement for index 987654
2023/02/16 18:25:00 -------------------------------------------
2023/02/16 18:25:00 Place Index With 2 Replica - 3 empty nodes - 1 SG
2023-02-16T18:25:00.986+05:30 [Info] Using greedy index placement for index 987654
2023/02/16 18:25:00 -------------------------------------------
2023/02/16 18:25:00 Place Index With 2 Replica - 3 non-empty nodes - 1 SG
2023-02-16T18:25:00.988+05:30 [Info] Using greedy index placement for index 987654
2023/02/16 18:25:00 -------------------------------------------
2023/02/16 18:25:00 Place Index With 1 Replica - 2 empty nodes, 1 non-empty node - 2 SG
2023-02-16T18:25:00.990+05:30 [Info] Using greedy index placement for index 987654
2023/02/16 18:25:00 -------------------------------------------
2023/02/16 18:25:00 Place Index With 1 Replica - 1 empty node, 2 non-empty nodes - 2 SG
2023-02-16T18:25:00.992+05:30 [Info] Using greedy index placement for index 987654
2023/02/16 18:25:00 -------------------------------------------
2023/02/16 18:25:00 Place Index With 1 Replica - 3 non-empty nodes - 2 SG
2023-02-16T18:25:00.994+05:30 [Info] Using greedy index placement for index 987654
2023/02/16 18:25:00 -------------------------------------------
2023/02/16 18:25:00 Place Equivalent Index Without any replica - 3 non-empty nodes - 1 SG
2023-02-16T18:25:00.996+05:30 [Info] Using greedy index placement for index 987654
2023/02/16 18:25:00 -------------------------------------------
2023/02/16 18:25:00 Place Equivalent Index With 1 Replica - 3 non-empty nodes - 1 SG - Skip least loaded node
2023-02-16T18:25:00.998+05:30 [Info] Using greedy index placement for index 987654
2023/02/16 18:25:00 -------------------------------------------
2023/02/16 18:25:00 Place Equivalent Index With 1 Replica - 3 non-empty nodes - 1 SG - Use least loaded node
2023-02-16T18:25:01.000+05:30 [Info] Using greedy index placement for index 987654
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Place 60 index instances on 3 empty nodes - 1 SG
2023-02-16T18:25:01.002+05:30 [Info] Using greedy index placement for index 1000987654
2023-02-16T18:25:01.004+05:30 [Info] Using greedy index placement for index 1001987654
2023-02-16T18:25:01.005+05:30 [Info] Using greedy index placement for index 1002987654
2023-02-16T18:25:01.007+05:30 [Info] Using greedy index placement for index 1003987654
2023-02-16T18:25:01.009+05:30 [Info] Using greedy index placement for index 1004987654
2023-02-16T18:25:01.011+05:30 [Info] Using greedy index placement for index 1005987654
2023-02-16T18:25:01.012+05:30 [Info] Using greedy index placement for index 1006987654
2023-02-16T18:25:01.014+05:30 [Info] Using greedy index placement for index 1007987654
2023-02-16T18:25:01.016+05:30 [Info] Using greedy index placement for index 1008987654
2023-02-16T18:25:01.018+05:30 [Info] Using greedy index placement for index 1009987654
2023-02-16T18:25:01.020+05:30 [Info] Using greedy index placement for index 1010987654
2023-02-16T18:25:01.022+05:30 [Info] Using greedy index placement for index 1011987654
2023-02-16T18:25:01.024+05:30 [Info] Using greedy index placement for index 1012987654
2023-02-16T18:25:01.026+05:30 [Info] Using greedy index placement for index 1013987654
2023-02-16T18:25:01.027+05:30 [Info] Using greedy index placement for index 1014987654
2023-02-16T18:25:01.029+05:30 [Info] Using greedy index placement for index 1015987654
2023-02-16T18:25:01.031+05:30 [Info] Using greedy index placement for index 1016987654
2023-02-16T18:25:01.032+05:30 [Info] Using greedy index placement for index 1017987654
2023-02-16T18:25:01.034+05:30 [Info] Using greedy index placement for index 1018987654
2023-02-16T18:25:01.036+05:30 [Info] Using greedy index placement for index 1019987654
2023-02-16T18:25:01.038+05:30 [Info] Using greedy index placement for index 1020987654
2023-02-16T18:25:01.039+05:30 [Info] Using greedy index placement for index 1021987654
2023-02-16T18:25:01.041+05:30 [Info] Using greedy index placement for index 1022987654
2023-02-16T18:25:01.043+05:30 [Info] Using greedy index placement for index 1023987654
2023-02-16T18:25:01.045+05:30 [Info] Using greedy index placement for index 1024987654
2023-02-16T18:25:01.046+05:30 [Info] Using greedy index placement for index 1025987654
2023-02-16T18:25:01.048+05:30 [Info] Using greedy index placement for index 1026987654
2023-02-16T18:25:01.050+05:30 [Info] Using greedy index placement for index 1027987654
2023-02-16T18:25:01.052+05:30 [Info] Using greedy index placement for index 1028987654
2023-02-16T18:25:01.054+05:30 [Info] Using greedy index placement for index 1029987654
2023-02-16T18:25:01.055+05:30 [Info] Using greedy index placement for index 1030987654
2023-02-16T18:25:01.057+05:30 [Info] Using greedy index placement for index 1031987654
2023-02-16T18:25:01.059+05:30 [Info] Using greedy index placement for index 1032987654
2023-02-16T18:25:01.061+05:30 [Info] Using greedy index placement for index 1033987654
2023-02-16T18:25:01.063+05:30 [Info] Using greedy index placement for index 1034987654
2023-02-16T18:25:01.065+05:30 [Info] Using greedy index placement for index 1035987654
2023-02-16T18:25:01.067+05:30 [Info] Using greedy index placement for index 1036987654
2023-02-16T18:25:01.069+05:30 [Info] Using greedy index placement for index 1037987654
2023-02-16T18:25:01.071+05:30 [Info] Using greedy index placement for index 1038987654
2023-02-16T18:25:01.072+05:30 [Info] Using greedy index placement for index 1039987654
2023-02-16T18:25:01.074+05:30 [Info] Using greedy index placement for index 1040987654
2023-02-16T18:25:01.076+05:30 [Info] Using greedy index placement for index 1041987654
2023-02-16T18:25:01.078+05:30 [Info] Using greedy index placement for index 1042987654
2023-02-16T18:25:01.080+05:30 [Info] Using greedy index placement for index 1043987654
2023-02-16T18:25:01.082+05:30 [Info] Using greedy index placement for index 1044987654
2023-02-16T18:25:01.084+05:30 [Info] Using greedy index placement for index 1045987654
2023-02-16T18:25:01.086+05:30 [Info] Using greedy index placement for index 1046987654
2023-02-16T18:25:01.088+05:30 [Info] Using greedy index placement for index 1047987654
2023-02-16T18:25:01.089+05:30 [Info] Using greedy index placement for index 1048987654
2023-02-16T18:25:01.091+05:30 [Info] Using greedy index placement for index 1049987654
2023-02-16T18:25:01.093+05:30 [Info] Using greedy index placement for index 1050987654
2023-02-16T18:25:01.095+05:30 [Info] Using greedy index placement for index 1051987654
2023-02-16T18:25:01.097+05:30 [Info] Using greedy index placement for index 1052987654
2023-02-16T18:25:01.098+05:30 [Info] Using greedy index placement for index 1053987654
2023-02-16T18:25:01.100+05:30 [Info] Using greedy index placement for index 1054987654
2023-02-16T18:25:01.102+05:30 [Info] Using greedy index placement for index 1055987654
2023-02-16T18:25:01.104+05:30 [Info] Using greedy index placement for index 1056987654
2023-02-16T18:25:01.106+05:30 [Info] Using greedy index placement for index 1057987654
2023-02-16T18:25:01.108+05:30 [Info] Using greedy index placement for index 1058987654
2023-02-16T18:25:01.110+05:30 [Info] Using greedy index placement for index 1059987654
2023-02-16T18:25:01.110+05:30 [Info] Actual variance of deferred index count across nodes is 0
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Place 60 index instances on 1 empty and 1 10 percent filled node - 1 SG
2023-02-16T18:25:01.112+05:30 [Info] Using greedy index placement for index 1000987654
2023-02-16T18:25:01.114+05:30 [Info] Using greedy index placement for index 1001987654
2023-02-16T18:25:01.115+05:30 [Info] Using greedy index placement for index 1002987654
2023-02-16T18:25:01.117+05:30 [Info] Using greedy index placement for index 1003987654
2023-02-16T18:25:01.119+05:30 [Info] Using greedy index placement for index 1004987654
2023-02-16T18:25:01.121+05:30 [Info] Using greedy index placement for index 1005987654
2023-02-16T18:25:01.123+05:30 [Info] Using greedy index placement for index 1006987654
2023-02-16T18:25:01.124+05:30 [Info] Using greedy index placement for index 1007987654
2023-02-16T18:25:01.126+05:30 [Info] Using greedy index placement for index 1008987654
2023-02-16T18:25:01.128+05:30 [Info] Using greedy index placement for index 1009987654
2023-02-16T18:25:01.129+05:30 [Info] Using greedy index placement for index 1010987654
2023-02-16T18:25:01.131+05:30 [Info] Using greedy index placement for index 1011987654
2023-02-16T18:25:01.133+05:30 [Info] Using greedy index placement for index 1012987654
2023-02-16T18:25:01.134+05:30 [Info] Using greedy index placement for index 1013987654
2023-02-16T18:25:01.136+05:30 [Info] Using greedy index placement for index 1014987654
2023-02-16T18:25:01.138+05:30 [Info] Using greedy index placement for index 1015987654
2023-02-16T18:25:01.140+05:30 [Info] Using greedy index placement for index 1016987654
2023-02-16T18:25:01.141+05:30 [Info] Using greedy index placement for index 1017987654
2023-02-16T18:25:01.143+05:30 [Info] Using greedy index placement for index 1018987654
2023-02-16T18:25:01.145+05:30 [Info] Using greedy index placement for index 1019987654
2023-02-16T18:25:01.146+05:30 [Info] Using greedy index placement for index 1020987654
2023-02-16T18:25:01.148+05:30 [Info] Using greedy index placement for index 1021987654
2023-02-16T18:25:01.150+05:30 [Info] Using greedy index placement for index 1022987654
2023-02-16T18:25:01.152+05:30 [Info] Using greedy index placement for index 1023987654
2023-02-16T18:25:01.154+05:30 [Info] Using greedy index placement for index 1024987654
2023-02-16T18:25:01.155+05:30 [Info] Using greedy index placement for index 1025987654
2023-02-16T18:25:01.157+05:30 [Info] Using greedy index placement for index 1026987654
2023-02-16T18:25:01.159+05:30 [Info] Using greedy index placement for index 1027987654
2023-02-16T18:25:01.161+05:30 [Info] Using greedy index placement for index 1028987654
2023-02-16T18:25:01.163+05:30 [Info] Using greedy index placement for index 1029987654
2023-02-16T18:25:01.164+05:30 [Info] Using greedy index placement for index 1030987654
2023-02-16T18:25:01.166+05:30 [Info] Using greedy index placement for index 1031987654
2023-02-16T18:25:01.168+05:30 [Info] Using greedy index placement for index 1032987654
2023-02-16T18:25:01.170+05:30 [Info] Using greedy index placement for index 1033987654
2023-02-16T18:25:01.172+05:30 [Info] Using greedy index placement for index 1034987654
2023-02-16T18:25:01.173+05:30 [Info] Using greedy index placement for index 1035987654
2023-02-16T18:25:01.175+05:30 [Info] Using greedy index placement for index 1036987654
2023-02-16T18:25:01.177+05:30 [Info] Using greedy index placement for index 1037987654
2023-02-16T18:25:01.179+05:30 [Info] Using greedy index placement for index 1038987654
2023-02-16T18:25:01.181+05:30 [Info] Using greedy index placement for index 1039987654
2023-02-16T18:25:01.182+05:30 [Info] Using greedy index placement for index 1040987654
2023-02-16T18:25:01.184+05:30 [Info] Using greedy index placement for index 1041987654
2023-02-16T18:25:01.186+05:30 [Info] Using greedy index placement for index 1042987654
2023-02-16T18:25:01.188+05:30 [Info] Using greedy index placement for index 1043987654
2023-02-16T18:25:01.190+05:30 [Info] Using greedy index placement for index 1044987654
2023-02-16T18:25:01.192+05:30 [Info] Using greedy index placement for index 1045987654
2023-02-16T18:25:01.193+05:30 [Info] Using greedy index placement for index 1046987654
2023-02-16T18:25:01.195+05:30 [Info] Using greedy index placement for index 1047987654
2023-02-16T18:25:01.197+05:30 [Info] Using greedy index placement for index 1048987654
2023-02-16T18:25:01.199+05:30 [Info] Using greedy index placement for index 1049987654
2023-02-16T18:25:01.201+05:30 [Info] Using greedy index placement for index 1050987654
2023-02-16T18:25:01.203+05:30 [Info] Using greedy index placement for index 1051987654
2023-02-16T18:25:01.205+05:30 [Info] Using greedy index placement for index 1052987654
2023-02-16T18:25:01.207+05:30 [Info] Using greedy index placement for index 1053987654
2023-02-16T18:25:01.209+05:30 [Info] Using greedy index placement for index 1054987654
2023-02-16T18:25:01.211+05:30 [Info] Using greedy index placement for index 1055987654
2023-02-16T18:25:01.213+05:30 [Info] Using greedy index placement for index 1056987654
2023-02-16T18:25:01.214+05:30 [Info] Using greedy index placement for index 1057987654
2023-02-16T18:25:01.216+05:30 [Info] Using greedy index placement for index 1058987654
2023-02-16T18:25:01.218+05:30 [Info] Using greedy index placement for index 1059987654
2023-02-16T18:25:01.218+05:30 [Info] Actual variance of deferred index count across nodes is 8
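The variance figures the planner logs are consistent with a sample (n-1 denominator) variance over per-node deferred index counts. A minimal sketch, assuming illustrative counts (a 28/32 split of the 60 indexes reproduces the reported value of 8; the planner's actual per-node counts are not shown in the log):

```go
package main

import "fmt"

// sampleVariance computes the (n-1)-denominator variance of per-node
// deferred index counts. Illustrative reconstruction only.
func sampleVariance(counts []float64) float64 {
	if len(counts) < 2 {
		return 0
	}
	var sum float64
	for _, c := range counts {
		sum += c
	}
	mean := sum / float64(len(counts))
	var ss float64
	for _, c := range counts {
		d := c - mean
		ss += d * d
	}
	return ss / float64(len(counts)-1)
}

func main() {
	// Hypothetical split of 60 indexes across the two nodes: 28 on the
	// pre-filled node, 32 on the empty one.
	fmt.Println(sampleVariance([]float64{28, 32})) // 8
}
```

Under the same convention, the later figures fit counts whose difference d satisfies d*d/2 = variance (e.g. 98 for a 23/37 split, 4.5 for a 1/4 split of 5 indexes).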
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Place 60 index instances on 1 empty and 1 30 percent filled node - 1 SG
2023-02-16T18:25:01.221+05:30 [Info] Using greedy index placement for index 1000987654
2023-02-16T18:25:01.222+05:30 [Info] Using greedy index placement for index 1001987654
2023-02-16T18:25:01.224+05:30 [Info] Using greedy index placement for index 1002987654
2023-02-16T18:25:01.226+05:30 [Info] Using greedy index placement for index 1003987654
2023-02-16T18:25:01.227+05:30 [Info] Using greedy index placement for index 1004987654
2023-02-16T18:25:01.229+05:30 [Info] Using greedy index placement for index 1005987654
2023-02-16T18:25:01.231+05:30 [Info] Using greedy index placement for index 1006987654
2023-02-16T18:25:01.232+05:30 [Info] Using greedy index placement for index 1007987654
2023-02-16T18:25:01.234+05:30 [Info] Using greedy index placement for index 1008987654
2023-02-16T18:25:01.236+05:30 [Info] Using greedy index placement for index 1009987654
2023-02-16T18:25:01.237+05:30 [Info] Using greedy index placement for index 1010987654
2023-02-16T18:25:01.239+05:30 [Info] Using greedy index placement for index 1011987654
2023-02-16T18:25:01.241+05:30 [Info] Using greedy index placement for index 1012987654
2023-02-16T18:25:01.243+05:30 [Info] Using greedy index placement for index 1013987654
2023-02-16T18:25:01.244+05:30 [Info] Using greedy index placement for index 1014987654
2023-02-16T18:25:01.246+05:30 [Info] Using greedy index placement for index 1015987654
2023-02-16T18:25:01.248+05:30 [Info] Using greedy index placement for index 1016987654
2023-02-16T18:25:01.250+05:30 [Info] Using greedy index placement for index 1017987654
2023-02-16T18:25:01.252+05:30 [Info] Using greedy index placement for index 1018987654
2023-02-16T18:25:01.253+05:30 [Info] Using greedy index placement for index 1019987654
2023-02-16T18:25:01.255+05:30 [Info] Using greedy index placement for index 1020987654
2023-02-16T18:25:01.257+05:30 [Info] Using greedy index placement for index 1021987654
2023-02-16T18:25:01.261+05:30 [Info] Using greedy index placement for index 1022987654
2023-02-16T18:25:01.263+05:30 [Info] Using greedy index placement for index 1023987654
2023-02-16T18:25:01.265+05:30 [Info] Using greedy index placement for index 1024987654
2023-02-16T18:25:01.269+05:30 [Info] Using greedy index placement for index 1025987654
2023-02-16T18:25:01.271+05:30 [Info] Using greedy index placement for index 1026987654
2023-02-16T18:25:01.273+05:30 [Info] Using greedy index placement for index 1027987654
2023-02-16T18:25:01.275+05:30 [Info] Using greedy index placement for index 1028987654
2023-02-16T18:25:01.277+05:30 [Info] Using greedy index placement for index 1029987654
2023-02-16T18:25:01.279+05:30 [Info] Using greedy index placement for index 1030987654
2023-02-16T18:25:01.281+05:30 [Info] Using greedy index placement for index 1031987654
2023-02-16T18:25:01.282+05:30 [Info] Using greedy index placement for index 1032987654
2023-02-16T18:25:01.284+05:30 [Info] Using greedy index placement for index 1033987654
2023-02-16T18:25:01.286+05:30 [Info] Using greedy index placement for index 1034987654
2023-02-16T18:25:01.288+05:30 [Info] Using greedy index placement for index 1035987654
2023-02-16T18:25:01.290+05:30 [Info] Using greedy index placement for index 1036987654
2023-02-16T18:25:01.291+05:30 [Info] Using greedy index placement for index 1037987654
2023-02-16T18:25:01.293+05:30 [Info] Using greedy index placement for index 1038987654
2023-02-16T18:25:01.295+05:30 [Info] Using greedy index placement for index 1039987654
2023-02-16T18:25:01.297+05:30 [Info] Using greedy index placement for index 1040987654
2023-02-16T18:25:01.299+05:30 [Info] Using greedy index placement for index 1041987654
2023-02-16T18:25:01.301+05:30 [Info] Using greedy index placement for index 1042987654
2023-02-16T18:25:01.302+05:30 [Info] Using greedy index placement for index 1043987654
2023-02-16T18:25:01.304+05:30 [Info] Using greedy index placement for index 1044987654
2023-02-16T18:25:01.306+05:30 [Info] Using greedy index placement for index 1045987654
2023-02-16T18:25:01.308+05:30 [Info] Using greedy index placement for index 1046987654
2023-02-16T18:25:01.310+05:30 [Info] Using greedy index placement for index 1047987654
2023-02-16T18:25:01.312+05:30 [Info] Using greedy index placement for index 1048987654
2023-02-16T18:25:01.313+05:30 [Info] Using greedy index placement for index 1049987654
2023-02-16T18:25:01.315+05:30 [Info] Using greedy index placement for index 1050987654
2023-02-16T18:25:01.317+05:30 [Info] Using greedy index placement for index 1051987654
2023-02-16T18:25:01.319+05:30 [Info] Using greedy index placement for index 1052987654
2023-02-16T18:25:01.322+05:30 [Info] Using greedy index placement for index 1053987654
2023-02-16T18:25:01.323+05:30 [Info] Using greedy index placement for index 1054987654
2023-02-16T18:25:01.325+05:30 [Info] Using greedy index placement for index 1055987654
2023-02-16T18:25:01.327+05:30 [Info] Using greedy index placement for index 1056987654
2023-02-16T18:25:01.329+05:30 [Info] Using greedy index placement for index 1057987654
2023-02-16T18:25:01.331+05:30 [Info] Using greedy index placement for index 1058987654
2023-02-16T18:25:01.333+05:30 [Info] Using greedy index placement for index 1059987654
2023-02-16T18:25:01.333+05:30 [Info] Actual variance of deferred index count across nodes is 98
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Place 5 index instances on 1 empty and 1 60 percent filled node - 1 SG
2023-02-16T18:25:01.335+05:30 [Info] Using greedy index placement for index 1000987654
2023-02-16T18:25:01.337+05:30 [Info] Using greedy index placement for index 1001987654
2023-02-16T18:25:01.339+05:30 [Info] Using greedy index placement for index 1002987654
2023-02-16T18:25:01.341+05:30 [Info] Using greedy index placement for index 1003987654
2023-02-16T18:25:01.343+05:30 [Info] Using greedy index placement for index 1004987654
2023-02-16T18:25:01.343+05:30 [Info] Actual variance of deferred index count across nodes is 4.5
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Place 60 index instances on 1 empty and 1 60 percent filled node - 1 SG
2023-02-16T18:25:01.345+05:30 [Info] Using greedy index placement for index 1000987654
2023-02-16T18:25:01.347+05:30 [Info] Using greedy index placement for index 1001987654
2023-02-16T18:25:01.349+05:30 [Info] Using greedy index placement for index 1002987654
2023-02-16T18:25:01.350+05:30 [Info] Using greedy index placement for index 1003987654
2023-02-16T18:25:01.352+05:30 [Info] Using greedy index placement for index 1004987654
2023-02-16T18:25:01.354+05:30 [Info] Using greedy index placement for index 1005987654
2023-02-16T18:25:01.355+05:30 [Info] Using greedy index placement for index 1006987654
2023-02-16T18:25:01.357+05:30 [Info] Using greedy index placement for index 1007987654
2023-02-16T18:25:01.359+05:30 [Info] Using greedy index placement for index 1008987654
2023-02-16T18:25:01.360+05:30 [Info] Using greedy index placement for index 1009987654
2023-02-16T18:25:01.362+05:30 [Info] Using greedy index placement for index 1010987654
2023-02-16T18:25:01.364+05:30 [Info] Using greedy index placement for index 1011987654
2023-02-16T18:25:01.366+05:30 [Info] Using greedy index placement for index 1012987654
2023-02-16T18:25:01.368+05:30 [Info] Using greedy index placement for index 1013987654
2023-02-16T18:25:01.369+05:30 [Info] Using greedy index placement for index 1014987654
2023-02-16T18:25:01.371+05:30 [Info] Using greedy index placement for index 1015987654
2023-02-16T18:25:01.373+05:30 [Info] Using greedy index placement for index 1016987654
2023-02-16T18:25:01.375+05:30 [Info] Using greedy index placement for index 1017987654
2023-02-16T18:25:01.377+05:30 [Info] Using greedy index placement for index 1018987654
2023-02-16T18:25:01.379+05:30 [Info] Using greedy index placement for index 1019987654
2023-02-16T18:25:01.381+05:30 [Info] Using greedy index placement for index 1020987654
2023-02-16T18:25:01.382+05:30 [Info] Using greedy index placement for index 1021987654
2023-02-16T18:25:01.384+05:30 [Info] Using greedy index placement for index 1022987654
2023-02-16T18:25:01.386+05:30 [Info] Using greedy index placement for index 1023987654
2023-02-16T18:25:01.388+05:30 [Info] Using greedy index placement for index 1024987654
2023-02-16T18:25:01.390+05:30 [Info] Using greedy index placement for index 1025987654
2023-02-16T18:25:01.391+05:30 [Info] Using greedy index placement for index 1026987654
2023-02-16T18:25:01.393+05:30 [Info] Using greedy index placement for index 1027987654
2023-02-16T18:25:01.395+05:30 [Info] Using greedy index placement for index 1028987654
2023-02-16T18:25:01.397+05:30 [Info] Using greedy index placement for index 1029987654
2023-02-16T18:25:01.399+05:30 [Info] Using greedy index placement for index 1030987654
2023-02-16T18:25:01.400+05:30 [Info] Using greedy index placement for index 1031987654
2023-02-16T18:25:01.402+05:30 [Info] Using greedy index placement for index 1032987654
2023-02-16T18:25:01.404+05:30 [Info] Using greedy index placement for index 1033987654
2023-02-16T18:25:01.406+05:30 [Info] Using greedy index placement for index 1034987654
2023-02-16T18:25:01.407+05:30 [Info] Using greedy index placement for index 1035987654
2023-02-16T18:25:01.409+05:30 [Info] Using greedy index placement for index 1036987654
2023-02-16T18:25:01.411+05:30 [Info] Using greedy index placement for index 1037987654
2023-02-16T18:25:01.413+05:30 [Info] Using greedy index placement for index 1038987654
2023-02-16T18:25:01.414+05:30 [Info] Using greedy index placement for index 1039987654
2023-02-16T18:25:01.416+05:30 [Info] Using greedy index placement for index 1040987654
2023-02-16T18:25:01.418+05:30 [Info] Using greedy index placement for index 1041987654
2023-02-16T18:25:01.420+05:30 [Info] Using greedy index placement for index 1042987654
2023-02-16T18:25:01.422+05:30 [Info] Using greedy index placement for index 1043987654
2023-02-16T18:25:01.424+05:30 [Info] Using greedy index placement for index 1044987654
2023-02-16T18:25:01.426+05:30 [Info] Using greedy index placement for index 1045987654
2023-02-16T18:25:01.428+05:30 [Info] Using greedy index placement for index 1046987654
2023-02-16T18:25:01.429+05:30 [Info] Using greedy index placement for index 1047987654
2023-02-16T18:25:01.431+05:30 [Info] Using greedy index placement for index 1048987654
2023-02-16T18:25:01.433+05:30 [Info] Using greedy index placement for index 1049987654
2023-02-16T18:25:01.435+05:30 [Info] Using greedy index placement for index 1050987654
2023-02-16T18:25:01.437+05:30 [Info] Using greedy index placement for index 1051987654
2023-02-16T18:25:01.439+05:30 [Info] Using greedy index placement for index 1052987654
2023-02-16T18:25:01.440+05:30 [Info] Using greedy index placement for index 1053987654
2023-02-16T18:25:01.442+05:30 [Info] Using greedy index placement for index 1054987654
2023-02-16T18:25:01.444+05:30 [Info] Using greedy index placement for index 1055987654
2023-02-16T18:25:01.446+05:30 [Info] Using greedy index placement for index 1056987654
2023-02-16T18:25:01.448+05:30 [Info] Using greedy index placement for index 1057987654
2023-02-16T18:25:01.449+05:30 [Info] Using greedy index placement for index 1058987654
2023-02-16T18:25:01.452+05:30 [Info] Using greedy index placement for index 1059987654
2023-02-16T18:25:01.452+05:30 [Info] Actual variance of deferred index count across nodes is 648
--- PASS: TestGreedyPlanner (0.49s)
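The "Using greedy index placement" lines suggest each new index is assigned to whichever node is currently least loaded. A sketch of that idea under an assumed uniform-cost model (node fields, the cost parameter, and the tie-breaking rule are all illustrative, not the planner's actual implementation):

```go
package main

import "fmt"

// node is a minimal stand-in for an indexer node; only the fields the
// greedy choice needs. Names and the cost model are illustrative.
type node struct {
	name     string
	usedPct  float64 // fraction of capacity already consumed
	deferred int     // deferred index instances placed so far
}

// placeGreedy assigns each new index to the currently least-loaded node.
func placeGreedy(nodes []*node, numIndexes int, costPerIndex float64) {
	for i := 0; i < numIndexes; i++ {
		best := nodes[0]
		for _, n := range nodes[1:] {
			if n.usedPct < best.usedPct {
				best = n
			}
		}
		best.usedPct += costPerIndex
		best.deferred++
	}
}

func main() {
	// One 10%-filled node and one empty node, 60 equal-cost indexes:
	// the empty node absorbs indexes until loads equalize, then
	// placement alternates.
	nodes := []*node{{name: "n1", usedPct: 0.10}, {name: "n2"}}
	placeGreedy(nodes, 60, 0.005)
	for _, n := range nodes {
		fmt.Printf("%s: %d deferred\n", n.name, n.deferred)
	}
}
```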
=== RUN   TestTenantAwarePlanner
2023/02/16 18:25:01 In TestTenantAwarePlanner()
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Place Single Index Instance - 1 empty node - 1 SG
2023-02-16T18:25:01.453+05:30 [Info] Planner::executeTenantAwarePlan Found SubClusters  [[127.0.0.1:9001]]
2023-02-16T18:25:01.453+05:30 [Info] Planner::executeTenantAwarePlan Found Candidate Based on Tenant Affinity []
2023-02-16T18:25:01.453+05:30 [Info] Planner::executeTenantAwarePlan Found Result [127.0.0.1:9001]
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Place Single Index Instance - 4 empty nodes - 2 SG
2023-02-16T18:25:01.455+05:30 [Info] Planner::executeTenantAwarePlan Found SubClusters  [[127.0.0.1:9004 127.0.0.1:9001] [127.0.0.1:9003 127.0.0.1:9002]]
2023-02-16T18:25:01.455+05:30 [Info] Planner::executeTenantAwarePlan Found Candidate Based on Tenant Affinity []
2023-02-16T18:25:01.455+05:30 [Info] Planner::executeTenantAwarePlan Found Result [127.0.0.1:9003 127.0.0.1:9002]
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Place Single Index Instance - 1 node - 1 SG
2023-02-16T18:25:01.457+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 300000000 Units 1000
2023-02-16T18:25:01.457+05:30 [Info] Planner::executeTenantAwarePlan Found SubClusters  [[127.0.0.1:9001]]
2023-02-16T18:25:01.457+05:30 [Info] Planner::executeTenantAwarePlan Found Candidate Based on Tenant Affinity [127.0.0.1:9001]
2023-02-16T18:25:01.457+05:30 [Info] Planner::executeTenantAwarePlan Found Result [127.0.0.1:9001]
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Place Single Index Instance - 2 nodes - 1 SG
2023-02-16T18:25:01.459+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 300000000 Units 1000
2023-02-16T18:25:01.459+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg1 Memory 300000000 Units 1000
2023-02-16T18:25:01.459+05:30 [Info] Planner::executeTenantAwarePlan Found SubClusters  [[127.0.0.1:9001] [127.0.0.1:9002]]
2023-02-16T18:25:01.459+05:30 [Info] Planner::executeTenantAwarePlan Found Candidate Based on Tenant Affinity [127.0.0.1:9001]
2023-02-16T18:25:01.459+05:30 [Info] Planner::executeTenantAwarePlan Found Result [127.0.0.1:9001]
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Place Single Index Instance - 4 nodes - 2 SG - Tenant Affinity(a)
2023-02-16T18:25:01.462+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 300000000 Units 1000
2023-02-16T18:25:01.462+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 300000000 Units 1000
2023-02-16T18:25:01.462+05:30 [Info] Planner::executeTenantAwarePlan Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9002] [127.0.0.1:9004 127.0.0.1:9003]]
2023-02-16T18:25:01.462+05:30 [Info] Planner::executeTenantAwarePlan Found Candidate Based on Tenant Affinity [127.0.0.1:9001 127.0.0.1:9002]
2023-02-16T18:25:01.462+05:30 [Info] Planner::executeTenantAwarePlan Found Result [127.0.0.1:9001 127.0.0.1:9002]
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Place Single Index Instance - 4 nodes - 2 SG - Tenant Affinity(b)
2023-02-16T18:25:01.464+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 300000000 Units 1000
2023-02-16T18:25:01.464+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg3 Memory 300000000 Units 1000
2023-02-16T18:25:01.464+05:30 [Info] Planner::executeTenantAwarePlan Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9003 127.0.0.1:9002]]
2023-02-16T18:25:01.464+05:30 [Info] Planner::executeTenantAwarePlan Found Candidate Based on Tenant Affinity [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.464+05:30 [Info] Planner::executeTenantAwarePlan Found Result [127.0.0.1:9001 127.0.0.1:9004]
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Place Single Index Instance - 4 nodes - 2 SG - Tenant Affinity(c)
2023-02-16T18:25:01.466+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 300000000 Units 1000
2023-02-16T18:25:01.466+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 300000000 Units 1000
2023-02-16T18:25:01.466+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9003 SG sg1 Memory 300000000 Units 1000
2023-02-16T18:25:01.466+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg3 Memory 300000000 Units 1000
2023-02-16T18:25:01.466+05:30 [Info] Planner::executeTenantAwarePlan Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9003]]
2023-02-16T18:25:01.466+05:30 [Info] Planner::executeTenantAwarePlan Found Candidate Based on Tenant Affinity [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.466+05:30 [Info] Planner::executeTenantAwarePlan Found Result [127.0.0.1:9001 127.0.0.1:9004]
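The groupIndexNodesIntoSubClusters lines above pair nodes into two-node subclusters whose members sit in different server groups, so replicas never share a server group. A first-fit sketch of that constraint (an illustrative reconstruction; the planner's pairing also accounts for existing index placement, which this ignores):

```go
package main

import "fmt"

// idxNode is a minimal indexer-node record for the pairing sketch.
type idxNode struct {
	addr string
	sg   string // server group
}

// pairIntoSubClusters greedily pairs each unpaired node with the next
// node from a different server group.
func pairIntoSubClusters(nodes []idxNode) [][]idxNode {
	var subs [][]idxNode
	used := make([]bool, len(nodes))
	for i := range nodes {
		if used[i] {
			continue
		}
		for j := i + 1; j < len(nodes); j++ {
			if !used[j] && nodes[j].sg != nodes[i].sg {
				subs = append(subs, []idxNode{nodes[i], nodes[j]})
				used[i], used[j] = true, true
				break
			}
		}
	}
	return subs
}

func main() {
	// The node set from the Tenant Affinity(c) case above.
	nodes := []idxNode{
		{"127.0.0.1:9001", "sg1"},
		{"127.0.0.1:9002", "sg2"},
		{"127.0.0.1:9003", "sg1"},
		{"127.0.0.1:9004", "sg3"},
	}
	fmt.Println(pairIntoSubClusters(nodes))
}
```

First-fit may pick different partners than the real planner (which paired 9001 with 9004 here), but every emitted subcluster still spans two server groups.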
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Place Single Index Instance - 6 nodes - 3 SG - Tenant Affinity Memory Above LWM
2023-02-16T18:25:01.469+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 300000000 Units 1000
2023-02-16T18:25:01.469+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 700000000 Units 1000
2023-02-16T18:25:01.469+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 300000000 Units 1000
2023-02-16T18:25:01.469+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 700000000 Units 1000
2023-02-16T18:25:01.469+05:30 [Info] Planner::executeTenantAwarePlan Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.469+05:30 [Info] Planner::executeTenantAwarePlan Found Candidate Based on Tenant Affinity [127.0.0.1:9002 127.0.0.1:9005]
2023-02-16T18:25:01.469+05:30 [Info] Planner::executeTenantAwarePlan Found Result [127.0.0.1:9002 127.0.0.1:9005]
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Place Single Index Instance - 6 nodes - 3 SG - Tenant Affinity Units Above LWM
2023-02-16T18:25:01.471+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 300000000 Units 1000
2023-02-16T18:25:01.471+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 300000000 Units 5000
2023-02-16T18:25:01.471+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 300000000 Units 1000
2023-02-16T18:25:01.471+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 300000000 Units 5000
2023-02-16T18:25:01.471+05:30 [Info] Planner::executeTenantAwarePlan Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.471+05:30 [Info] Planner::executeTenantAwarePlan Found Candidate Based on Tenant Affinity [127.0.0.1:9002 127.0.0.1:9005]
2023-02-16T18:25:01.471+05:30 [Info] Planner::executeTenantAwarePlan Found Result [127.0.0.1:9002 127.0.0.1:9005]
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Place Single Index Instance - 6 nodes - 3 SG - Tenant Affinity New Tenant(a)
2023-02-16T18:25:01.473+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 300000000 Units 1000
2023-02-16T18:25:01.474+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 700000000 Units 1000
2023-02-16T18:25:01.474+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 300000000 Units 1000
2023-02-16T18:25:01.474+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 700000000 Units 1000
2023-02-16T18:25:01.474+05:30 [Info] Planner::executeTenantAwarePlan Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.474+05:30 [Info] Planner::executeTenantAwarePlan Found Candidate Based on Tenant Affinity []
2023-02-16T18:25:01.474+05:30 [Info] Planner::executeTenantAwarePlan Found Result [127.0.0.1:9006 127.0.0.1:9003]
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Place Single Index Instance - 6 nodes - 3 SG - Tenant Affinity New Tenant(b)
2023-02-16T18:25:01.476+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 300000000 Units 1000
2023-02-16T18:25:01.476+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 700000000 Units 1000
2023-02-16T18:25:01.476+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 300000000 Units 1000
2023-02-16T18:25:01.476+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 700000000 Units 1000
2023-02-16T18:25:01.476+05:30 [Info] Planner::executeTenantAwarePlan Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.476+05:30 [Info] Planner::executeTenantAwarePlan Found Candidate Based on Tenant Affinity []
2023-02-16T18:25:01.476+05:30 [Info] Planner::executeTenantAwarePlan Found Result [127.0.0.1:9006 127.0.0.1:9003]
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Place Single Index Instance - 2 empty nodes - 1 SG
2023-02-16T18:25:01.478+05:30 [Info] Planner::executeTenantAwarePlan Found SubClusters  [[127.0.0.1:9001] [127.0.0.1:9002]]
2023-02-16T18:25:01.478+05:30 [Info] Planner::executeTenantAwarePlan Found Candidate Based on Tenant Affinity []
2023-02-16T18:25:01.478+05:30 [Info] Planner::filterPartialSubClusters Filter partial subcluster [127.0.0.1:9002]
2023-02-16T18:25:01.478+05:30 [Info] Planner::filterPartialSubClusters Filter partial subcluster [127.0.0.1:9001]
2023-02-16T18:25:01.478+05:30 [Info] Planner::executeTenantAwarePlan Found no matching candidate for tenant &{idx1 bucket1   987654 false [name1]  false false false false 0  0 [] 2 []   false 0 0 0 0 0 0 0 0}. Error - Unable to find any valid SubCluster
2023/02/16 18:25:01 Expected error Planner not able to find any node for placement - Unable to find any valid SubCluster
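The filterPartialSubClusters lines show that single-node subclusters are discarded before placement, and the run fails when nothing remains. A sketch of that filtering step, assuming a complete subcluster always has exactly two nodes (the function and error shapes here are illustrative, though the error text mirrors the log):

```go
package main

import (
	"errors"
	"fmt"
)

// filterPartialSubClusters drops subclusters lacking the full two nodes
// needed to host both replicas.
func filterPartialSubClusters(subs [][]string) [][]string {
	var full [][]string
	for _, s := range subs {
		if len(s) == 2 {
			full = append(full, s)
		}
	}
	return full
}

// place returns the first complete subcluster, or the error the log
// reports when every subcluster is partial.
func place(subs [][]string) ([]string, error) {
	candidates := filterPartialSubClusters(subs)
	if len(candidates) == 0 {
		return nil, errors.New("Unable to find any valid SubCluster")
	}
	return candidates[0], nil
}

func main() {
	// Two single-node subclusters, as in the "2 empty nodes - 1 SG" case:
	// both are filtered out and placement fails.
	_, err := place([][]string{{"127.0.0.1:9001"}, {"127.0.0.1:9002"}})
	fmt.Println(err) // Unable to find any valid SubCluster
}
```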
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Place Single Index Instance - 4 nodes - 2 SG - Tenant Affinity Above Memory HWM
2023-02-16T18:25:01.478+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 900000000 Units 1000
2023-02-16T18:25:01.478+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 300000000 Units 1000
2023-02-16T18:25:01.478+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9003 SG sg1 Memory 300000000 Units 1000
2023-02-16T18:25:01.478+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg3 Memory 900000000 Units 1000
2023-02-16T18:25:01.478+05:30 [Info] Planner::executeTenantAwarePlan Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9003]]
2023-02-16T18:25:01.479+05:30 [Info] Planner::executeTenantAwarePlan Found Candidate Based on Tenant Affinity [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.479+05:30 [Info] Planner::executeTenantAwarePlan Found no matching candidate for tenant &{idx1 bucket1   987654 false [name1]  false false false false 0  0 [] 2 []   false 0 0 0 0 0 0 0 0}. Error - Tenant SubCluster Above High Usage Threshold
2023/02/16 18:25:01 Expected error Planner not able to find any node for placement - Tenant SubCluster Above High Usage Threshold
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Place Single Index Instance - 4 nodes - 2 SG - Tenant Affinity Above Units HWM
2023-02-16T18:25:01.479+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 300000000 Units 8000
2023-02-16T18:25:01.479+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 300000000 Units 1000
2023-02-16T18:25:01.479+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9003 SG sg1 Memory 300000000 Units 1000
2023-02-16T18:25:01.479+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg3 Memory 300000000 Units 8000
2023-02-16T18:25:01.479+05:30 [Info] Planner::executeTenantAwarePlan Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9003]]
2023-02-16T18:25:01.479+05:30 [Info] Planner::executeTenantAwarePlan Found Candidate Based on Tenant Affinity [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.479+05:30 [Info] Planner::executeTenantAwarePlan Found no matching candidate for tenant &{idx1 bucket1   987654 false [name1]  false false false false 0  0 [] 2 []   false 0 0 0 0 0 0 0 0}. Error - Tenant SubCluster Above High Usage Threshold
2023/02/16 18:25:01 Expected error Planner not able to find any node for placement - Tenant SubCluster Above High Usage Threshold
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Place Single Index Instance - 6 nodes - 3 SG - New Tenant Memory Above LWM
2023-02-16T18:25:01.480+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 680000000 Units 1000
2023-02-16T18:25:01.480+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 700000000 Units 1000
2023-02-16T18:25:01.480+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 680000000 Units 1000
2023-02-16T18:25:01.480+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 700000000 Units 1000
2023-02-16T18:25:01.480+05:30 [Info] Planner::executeTenantAwarePlan Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.480+05:30 [Info] Planner::executeTenantAwarePlan Found Candidate Based on Tenant Affinity []
2023-02-16T18:25:01.480+05:30 [Info] Planner::executeTenantAwarePlan Found no matching candidate for tenant &{idx1 bucket7   987654 false [name1]  false false false false 0  0 [] 2 []   false 0 0 0 0 0 0 0 0}. Error - No SubCluster Below Low Usage Threshold
2023/02/16 18:25:01 Expected error Planner not able to find any node for placement - No SubCluster Below Low Usage Threshold
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Place Single Index Instance - 6 nodes - 3 SG - New Tenant Units Above LWM
2023-02-16T18:25:01.480+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 300000000 Units 5000
2023-02-16T18:25:01.480+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 300000000 Units 5500
2023-02-16T18:25:01.480+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 300000000 Units 5000
2023-02-16T18:25:01.480+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 300000000 Units 5500
2023-02-16T18:25:01.480+05:30 [Info] Planner::executeTenantAwarePlan Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.480+05:30 [Info] Planner::executeTenantAwarePlan Found Candidate Based on Tenant Affinity []
2023-02-16T18:25:01.480+05:30 [Info] Planner::executeTenantAwarePlan Found no matching candidate for tenant &{idx1 bucket7   987654 false [name1]  false false false false 0  0 [] 2 []   false 0 0 0 0 0 0 0 0}. Error - No SubCluster Below Low Usage Threshold
2023/02/16 18:25:01 Expected error Planner not able to find any node for placement - No SubCluster Below Low Usage Threshold
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 3 SG, 1 empty, 1 Memory Above HWM
2023-02-16T18:25:01.482+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 300000000 Units 1000
2023-02-16T18:25:01.482+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 900000000 Units 1000
2023-02-16T18:25:01.482+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 300000000 Units 1000
2023-02-16T18:25:01.482+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 900000000 Units 1000
2023-02-16T18:25:01.482+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.482+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM [[127.0.0.1:9002 127.0.0.1:9005]]
2023-02-16T18:25:01.482+05:30 [Info] Planner::executeTenantAwareRebal TenantsToBeMoved from source [127.0.0.1:9002 127.0.0.1:9005]
2023-02-16T18:25:01.482+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket3 MemoryUsage 400000000 UnitsUsage 500 
2023-02-16T18:25:01.482+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters Below LWM [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.482+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket3 MemoryUsage 400000000 UnitsUsage 500  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
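The test case above shows the planner's watermark bucketing: subclusters whose nodes exceed a high watermark (HWM) become move sources, and only subclusters entirely under the low watermark (LWM) are placement targets. A minimal sketch of that classification, with hypothetical threshold values and data shapes (the real indexer code and its thresholds are not shown in this log):

```python
# Hypothetical sketch of the HWM/LWM bucketing suggested by the log above.
# Threshold values and tuple layout are illustrative assumptions, not the
# actual couchbase indexing planner implementation.

HWM_MEM, LWM_MEM = 800_000_000, 400_000_000  # assumed memory watermarks
HWM_UNITS, LWM_UNITS = 5000, 3000            # assumed units watermarks

def classify_subclusters(subclusters):
    """Split subclusters into above-HWM (sources) and below-LWM (targets).

    subclusters: list of subclusters, each a list of (node, mem, units).
    A subcluster is above HWM if any node exceeds either high watermark;
    it is below LWM only if every node is under both low watermarks.
    """
    above, below = [], []
    for sub in subclusters:
        if any(mem > HWM_MEM or units > HWM_UNITS for _, mem, units in sub):
            above.append(sub)
        elif all(mem < LWM_MEM and units < LWM_UNITS for _, mem, units in sub):
            below.append(sub)
    return above, below

# Mirrors the "1 Memory Above HWM" case: one hot pair, two cool pairs.
subs = [
    [("127.0.0.1:9001", 300_000_000, 1000), ("127.0.0.1:9004", 300_000_000, 1000)],
    [("127.0.0.1:9002", 900_000_000, 1000), ("127.0.0.1:9005", 900_000_000, 1000)],
    [("127.0.0.1:9006", 0, 0), ("127.0.0.1:9003", 0, 0)],
]
above, below = classify_subclusters(subs)
# above contains only the 9002/9005 pair; below contains the other two pairs.
```

With these assumed thresholds the output matches the log: `[[9002 9005]]` above HWM and `[[9001 9004] [9006 9003]]` below LWM.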
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 3 SG, 1 empty, 1 Units Above HWM
2023-02-16T18:25:01.483+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 400000000 Units 8000
2023-02-16T18:25:01.483+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 400000000 Units 1000
2023-02-16T18:25:01.483+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 400000000 Units 8000
2023-02-16T18:25:01.483+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 400000000 Units 1000
2023-02-16T18:25:01.483+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.483+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM [[127.0.0.1:9001 127.0.0.1:9004]]
2023-02-16T18:25:01.483+05:30 [Info] Planner::executeTenantAwareRebal TenantsToBeMoved from source [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.483+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 200000000 UnitsUsage 3000 
2023-02-16T18:25:01.483+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters Below LWM [[127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.483+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 200000000 UnitsUsage 3000  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 3 SG, 1 empty, Both Memory/Units Above HWM
2023-02-16T18:25:01.484+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 900000000 Units 9000
2023-02-16T18:25:01.484+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 300000000 Units 1000
2023-02-16T18:25:01.484+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 900000000 Units 9000
2023-02-16T18:25:01.484+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 300000000 Units 1000
2023-02-16T18:25:01.484+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.484+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM [[127.0.0.1:9001 127.0.0.1:9004]]
2023-02-16T18:25:01.484+05:30 [Info] Planner::executeTenantAwareRebal TenantsToBeMoved from source [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.484+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket4 MemoryUsage 300000000 UnitsUsage 3000 
2023-02-16T18:25:01.484+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 200000000 UnitsUsage 2000 
2023-02-16T18:25:01.484+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters Below LWM [[127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.484+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket4 MemoryUsage 300000000 UnitsUsage 3000  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.484+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 200000000 UnitsUsage 2000  can be placed on [127.0.0.1:9002 127.0.0.1:9005]
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 3 SG, Multiple tenants to move, single source, multiple destination
2023-02-16T18:25:01.485+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 900000000 Units 9000
2023-02-16T18:25:01.485+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 200000000 Units 1000
2023-02-16T18:25:01.485+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 900000000 Units 9000
2023-02-16T18:25:01.485+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 200000000 Units 1000
2023-02-16T18:25:01.485+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.485+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM [[127.0.0.1:9001 127.0.0.1:9004]]
2023-02-16T18:25:01.485+05:30 [Info] Planner::executeTenantAwareRebal TenantsToBeMoved from source [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.485+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 200000000 UnitsUsage 4000 
2023-02-16T18:25:01.485+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket5 MemoryUsage 400000000 UnitsUsage 2000 
2023-02-16T18:25:01.485+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters Below LWM [[127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.485+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 200000000 UnitsUsage 4000  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.485+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket5 MemoryUsage 400000000 UnitsUsage 2000  can be placed on [127.0.0.1:9002 127.0.0.1:9005]
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 3 SG, Multiple tenants to move, no nodes below LWM
2023-02-16T18:25:01.486+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 300000000 Units 5000
2023-02-16T18:25:01.486+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 900000000 Units 3000
2023-02-16T18:25:01.486+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9003 SG sg1 Memory 700000000 Units 3000
2023-02-16T18:25:01.486+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 300000000 Units 5000
2023-02-16T18:25:01.486+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 900000000 Units 3000
2023-02-16T18:25:01.486+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9006 SG sg3 Memory 700000000 Units 3000
2023-02-16T18:25:01.486+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9003 127.0.0.1:9006]]
2023-02-16T18:25:01.486+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM [[127.0.0.1:9002 127.0.0.1:9005]]
2023-02-16T18:25:01.486+05:30 [Info] Planner::executeTenantAwareRebal TenantsToBeMoved from source [127.0.0.1:9002 127.0.0.1:9005]
2023-02-16T18:25:01.486+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket3 MemoryUsage 300000000 UnitsUsage 500 
2023-02-16T18:25:01.486+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket1 MemoryUsage 200000000 UnitsUsage 500 
2023-02-16T18:25:01.487+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters Below LWM []
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 4 SG, Multiple tenants to move, multiple source, multiple destination(non-uniform memory/units usage)
2023-02-16T18:25:01.487+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 900000000 Units 8000
2023-02-16T18:25:01.488+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 1000000000 Units 7000
2023-02-16T18:25:01.488+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 900000000 Units 8000
2023-02-16T18:25:01.488+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 1000000000 Units 7000
2023-02-16T18:25:01.488+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9007 SG sg1 Memory 200000000 Units 100
2023-02-16T18:25:01.488+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9008 SG sg2 Memory 200000000 Units 100
2023-02-16T18:25:01.488+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9007 127.0.0.1:9008] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.488+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005]]
2023-02-16T18:25:01.488+05:30 [Info] Planner::executeTenantAwareRebal TenantsToBeMoved from source [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.488+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 70000000 UnitsUsage 1200 
2023-02-16T18:25:01.488+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket3 MemoryUsage 80000000 UnitsUsage 800 
2023-02-16T18:25:01.488+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket1 MemoryUsage 30000000 UnitsUsage 1500 
2023-02-16T18:25:01.488+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket4 MemoryUsage 120000000 UnitsUsage 500 
2023-02-16T18:25:01.488+05:30 [Info] Planner::executeTenantAwareRebal TenantsToBeMoved from source [127.0.0.1:9002 127.0.0.1:9005]
2023-02-16T18:25:01.488+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket9 MemoryUsage 100000000 UnitsUsage 500 
2023-02-16T18:25:01.488+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket10 MemoryUsage 200000000 UnitsUsage 800 
2023-02-16T18:25:01.488+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket8 MemoryUsage 75000000 UnitsUsage 700 
2023-02-16T18:25:01.488+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket7 MemoryUsage 25000000 UnitsUsage 1000 
2023-02-16T18:25:01.488+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters Below LWM [[127.0.0.1:9007 127.0.0.1:9008] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.488+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 70000000 UnitsUsage 1200  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.488+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket9 MemoryUsage 100000000 UnitsUsage 500  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.488+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket3 MemoryUsage 80000000 UnitsUsage 800  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.488+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket10 MemoryUsage 200000000 UnitsUsage 800  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.488+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket1 MemoryUsage 30000000 UnitsUsage 1500  can be placed on [127.0.0.1:9007 127.0.0.1:9008]
2023-02-16T18:25:01.488+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket8 MemoryUsage 75000000 UnitsUsage 700  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.488+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket4 MemoryUsage 120000000 UnitsUsage 500  can be placed on [127.0.0.1:9007 127.0.0.1:9008]
2023-02-16T18:25:01.488+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket7 MemoryUsage 25000000 UnitsUsage 1000  can be placed on [127.0.0.1:9007 127.0.0.1:9008]
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 4 SG, Multiple tenants to move, multiple source, multiple destination(non-uniform memory/units usage)
2023-02-16T18:25:01.489+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 900000000 Units 8000
2023-02-16T18:25:01.489+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 1000000000 Units 7000
2023-02-16T18:25:01.489+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 900000000 Units 8000
2023-02-16T18:25:01.489+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 1000000000 Units 7000
2023-02-16T18:25:01.489+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9007 SG sg1 Memory 200000000 Units 100
2023-02-16T18:25:01.489+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9008 SG sg2 Memory 200000000 Units 100
2023-02-16T18:25:01.489+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9007 127.0.0.1:9008] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.489+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005]]
2023-02-16T18:25:01.489+05:30 [Info] Planner::executeTenantAwareRebal TenantsToBeMoved from source [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.489+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket4 MemoryUsage 120000000 UnitsUsage 1500 
2023-02-16T18:25:01.489+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket3 MemoryUsage 80000000 UnitsUsage 1200 
2023-02-16T18:25:01.489+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 70000000 UnitsUsage 800 
2023-02-16T18:25:01.489+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket1 MemoryUsage 30000000 UnitsUsage 500 
2023-02-16T18:25:01.489+05:30 [Info] Planner::executeTenantAwareRebal TenantsToBeMoved from source [127.0.0.1:9002 127.0.0.1:9005]
2023-02-16T18:25:01.489+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket9 MemoryUsage 100000000 UnitsUsage 700 
2023-02-16T18:25:01.489+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket10 MemoryUsage 200000000 UnitsUsage 800 
2023-02-16T18:25:01.489+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket8 MemoryUsage 75000000 UnitsUsage 1000 
2023-02-16T18:25:01.490+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket7 MemoryUsage 25000000 UnitsUsage 500 
2023-02-16T18:25:01.490+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters Below LWM [[127.0.0.1:9007 127.0.0.1:9008] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.490+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket4 MemoryUsage 120000000 UnitsUsage 1500  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.490+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket9 MemoryUsage 100000000 UnitsUsage 700  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.490+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket3 MemoryUsage 80000000 UnitsUsage 1200  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.490+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket10 MemoryUsage 200000000 UnitsUsage 800  can be placed on [127.0.0.1:9007 127.0.0.1:9008]
2023-02-16T18:25:01.490+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 70000000 UnitsUsage 800  can be placed on [127.0.0.1:9007 127.0.0.1:9008]
2023-02-16T18:25:01.490+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket8 MemoryUsage 75000000 UnitsUsage 1000  can be placed on [127.0.0.1:9007 127.0.0.1:9008]
2023-02-16T18:25:01.490+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket1 MemoryUsage 30000000 UnitsUsage 500  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.490+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket7 MemoryUsage 25000000 UnitsUsage 500  can be placed on [127.0.0.1:9007 127.0.0.1:9008]
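The two multi-source cases above show tenants being drained largest-first from each hot subcluster and spread across the below-LWM destinations. A greedy sketch of that placement loop, under assumed capacities (the real planner's ordering and capacity checks may differ; all names here are illustrative):

```python
# Illustrative greedy tenant placement, not the real planner: sort tenants
# by memory descending, then put each on the least-loaded destination that
# still has room under both assumed capacity limits.

def place_tenants(tenants, destinations, mem_cap, units_cap):
    """tenants: list of (name, mem, units); destinations: list of ids.
    Returns {tenant_name: destination} for every tenant that fits."""
    placement = {}
    load = {d: [0, 0] for d in destinations}  # dest -> [mem, units]
    for name, mem, units in sorted(tenants, key=lambda t: -t[1]):
        fits = [d for d in destinations
                if load[d][0] + mem <= mem_cap and load[d][1] + units <= units_cap]
        if not fits:
            continue  # no destination has headroom: tenant is not moved
        dest = min(fits, key=lambda d: load[d][0])  # least memory-loaded
        load[dest][0] += mem
        load[dest][1] += units
        placement[name] = dest
    return placement

tenants = [("bucket10", 200, 800), ("bucket4", 120, 1500), ("bucket9", 100, 700)]
result = place_tenants(tenants, ["subA", "subB"], mem_cap=300, units_cap=2000)
```

Because each placement updates the destination's running load, large tenants naturally alternate across destinations, which is the interleaving pattern visible in the `findTenantPlacement` lines above.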
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 3 SG, Single Large Tenant, Nothing to move
2023-02-16T18:25:01.491+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 900000000 Units 1000
2023-02-16T18:25:01.491+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 900000000 Units 2000
2023-02-16T18:25:01.491+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 900000000 Units 1000
2023-02-16T18:25:01.491+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 900000000 Units 2000
2023-02-16T18:25:01.491+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.491+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005]]
2023-02-16T18:25:01.491+05:30 [Info] Planner::executeTenantAwareRebal TenantsToBeMoved from source [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.491+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket3 MemoryUsage 200000000 UnitsUsage 1000 
2023-02-16T18:25:01.491+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters Below LWM [[127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.491+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket3 MemoryUsage 200000000 UnitsUsage 1000  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 4 SG, Multiple tenants to move, multiple source, multiple destination(zero usage tenants)
2023-02-16T18:25:01.492+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 900000000 Units 8000
2023-02-16T18:25:01.492+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 1000000000 Units 7000
2023-02-16T18:25:01.492+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 900000000 Units 8000
2023-02-16T18:25:01.492+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 1000000000 Units 7000
2023-02-16T18:25:01.492+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9007 SG sg1 Memory 200000000 Units 100
2023-02-16T18:25:01.492+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9008 SG sg2 Memory 200000000 Units 100
2023-02-16T18:25:01.493+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9007 127.0.0.1:9008] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.493+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005]]
2023-02-16T18:25:01.493+05:30 [Info] Planner::executeTenantAwareRebal TenantsToBeMoved from source [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.493+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket4 MemoryUsage 120000000 UnitsUsage 1500 
2023-02-16T18:25:01.493+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket3 MemoryUsage 80000000 UnitsUsage 1200 
2023-02-16T18:25:01.493+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 70000000 UnitsUsage 800 
2023-02-16T18:25:01.493+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket1 MemoryUsage 30000000 UnitsUsage 500 
2023-02-16T18:25:01.493+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket17 MemoryUsage 0 UnitsUsage 0 
2023-02-16T18:25:01.493+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket14 MemoryUsage 0 UnitsUsage 0 
2023-02-16T18:25:01.493+05:30 [Info] Planner::executeTenantAwareRebal TenantsToBeMoved from source [127.0.0.1:9002 127.0.0.1:9005]
2023-02-16T18:25:01.493+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket9 MemoryUsage 100000000 UnitsUsage 700 
2023-02-16T18:25:01.493+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket10 MemoryUsage 200000000 UnitsUsage 800 
2023-02-16T18:25:01.493+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket8 MemoryUsage 75000000 UnitsUsage 1000 
2023-02-16T18:25:01.493+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket7 MemoryUsage 25000000 UnitsUsage 500 
2023-02-16T18:25:01.493+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket15 MemoryUsage 0 UnitsUsage 0 
2023-02-16T18:25:01.493+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket16 MemoryUsage 0 UnitsUsage 0 
2023-02-16T18:25:01.493+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters Below LWM [[127.0.0.1:9007 127.0.0.1:9008] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.493+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket4 MemoryUsage 120000000 UnitsUsage 1500  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.493+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket9 MemoryUsage 100000000 UnitsUsage 700  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.493+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket3 MemoryUsage 80000000 UnitsUsage 1200  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.493+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket10 MemoryUsage 200000000 UnitsUsage 800  can be placed on [127.0.0.1:9007 127.0.0.1:9008]
2023-02-16T18:25:01.493+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 70000000 UnitsUsage 800  can be placed on [127.0.0.1:9007 127.0.0.1:9008]
2023-02-16T18:25:01.493+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket8 MemoryUsage 75000000 UnitsUsage 1000  can be placed on [127.0.0.1:9007 127.0.0.1:9008]
2023-02-16T18:25:01.493+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket1 MemoryUsage 30000000 UnitsUsage 500  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.493+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket7 MemoryUsage 25000000 UnitsUsage 500  can be placed on [127.0.0.1:9007 127.0.0.1:9008]
2023-02-16T18:25:01.493+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket17 MemoryUsage 0 UnitsUsage 0  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.493+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket15 MemoryUsage 0 UnitsUsage 0  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.493+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket14 MemoryUsage 0 UnitsUsage 0  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.493+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket16 MemoryUsage 0 UnitsUsage 0  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 3 SG, 1 Partial Subcluster
2023-02-16T18:25:01.496+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 100000000 Units 1000
2023-02-16T18:25:01.496+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 900000000 Units 3000
2023-02-16T18:25:01.496+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 100000000 Units 1000
2023-02-16T18:25:01.496+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 900000000 Units 3000
2023-02-16T18:25:01.496+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9003]]
2023-02-16T18:25:01.496+05:30 [Info] Planner::repairMissingReplica Found SubCluster [127.0.0.1:9003] with len 1. Skipping replica repair attempt.
2023-02-16T18:25:01.496+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM [[127.0.0.1:9002 127.0.0.1:9005]]
2023-02-16T18:25:01.496+05:30 [Info] Planner::executeTenantAwareRebal TenantsToBeMoved from source [127.0.0.1:9002 127.0.0.1:9005]
2023-02-16T18:25:01.496+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket3 MemoryUsage 300000000 UnitsUsage 500 
2023-02-16T18:25:01.496+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket1 MemoryUsage 200000000 UnitsUsage 500 
2023-02-16T18:25:01.496+05:30 [Info] Planner::filterPartialSubClusters Filter partial subcluster [127.0.0.1:9003]
2023-02-16T18:25:01.496+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters Below LWM [[127.0.0.1:9001 127.0.0.1:9004]]
2023-02-16T18:25:01.496+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket3 MemoryUsage 300000000 UnitsUsage 500  can be placed on [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.496+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket1 MemoryUsage 200000000 UnitsUsage 500  can be placed on [127.0.0.1:9001 127.0.0.1:9004]
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Replica Repair - 4 SG, Missing Replicas for multiple tenants in SG
2023-02-16T18:25:01.497+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 100000000 Units 1000
2023-02-16T18:25:01.497+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 100000000 Units 1000
2023-02-16T18:25:01.497+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 100000000 Units 1000
2023-02-16T18:25:01.497+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 100000000 Units 1000
2023-02-16T18:25:01.497+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9007 SG sg1 Memory 200000000 Units 100
2023-02-16T18:25:01.497+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9008 SG sg2 Memory 200000000 Units 100
2023-02-16T18:25:01.497+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9007 127.0.0.1:9008] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.497+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket2,,,idx2,1) on 127.0.0.1:9004
2023-02-16T18:25:01.497+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket4,,,idx2,0) on 127.0.0.1:9001
2023-02-16T18:25:01.497+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket8,,,idx1,1) on 127.0.0.1:9005
2023-02-16T18:25:01.497+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket8,,,idx1,1) on 127.0.0.1:9005
2023-02-16T18:25:01.497+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM []
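The replica-repair lines above rebuild an index on the buddy node whenever only one replica of the pair survives, and skip subclusters with a single node (no buddy to rebuild on). A minimal sketch of that check for one two-node subcluster, with hypothetical data shapes:

```python
# Hypothetical replica-repair check for a two-node subcluster, mirroring the
# placeMissingReplicaOnTarget / repairMissingReplica log lines. Data shapes
# are illustrative, not the actual indexer structures.

def repair_missing_replicas(subcluster, indexes):
    """subcluster: tuple of two node ids; indexes: {index_name: set(host nodes)}.
    Returns (index_name, target_node) rebuild actions for lost replicas."""
    if len(subcluster) < 2:
        return []  # len-1 subcluster: skip repair, as the planner logs
    a, b = subcluster
    actions = []
    for idx, hosts in indexes.items():
        present = hosts & {a, b}
        if len(present) == 1:  # exactly one replica survives in this pair
            target = b if a in present else a
            actions.append((idx, target))
    return actions

actions = repair_missing_replicas(
    ("127.0.0.1:9001", "127.0.0.1:9004"),
    {"idx1": {"127.0.0.1:9001"},                      # replica lost on buddy
     "idx2": {"127.0.0.1:9001", "127.0.0.1:9004"}},   # both replicas intact
)
```

Here only `idx1` needs repair, and its target is the buddy node `127.0.0.1:9004`, matching the "Rebuilding lost replica ... on <buddy>" pattern in the log.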
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Replica Repair - 4 SG, Missing Replicas, Buddy Node Failed over
2023-02-16T18:25:01.498+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 100000000 Units 1000
2023-02-16T18:25:01.498+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 100000000 Units 1000
2023-02-16T18:25:01.498+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 100000000 Units 1000
2023-02-16T18:25:01.498+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9007 SG sg1 Memory 200000000 Units 100
2023-02-16T18:25:01.498+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9008 SG sg2 Memory 200000000 Units 100
2023-02-16T18:25:01.498+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9007 127.0.0.1:9008] [127.0.0.1:9003]]
2023-02-16T18:25:01.498+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket1,,,idx1,1) on 127.0.0.1:9004
2023-02-16T18:25:01.498+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket2,,,idx2,1) on 127.0.0.1:9004
2023-02-16T18:25:01.498+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket3,,,idx1,1) on 127.0.0.1:9004
2023-02-16T18:25:01.498+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket8,,,idx1,1) on 127.0.0.1:9005
2023-02-16T18:25:01.498+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket8,,,idx1,1) on 127.0.0.1:9005
2023-02-16T18:25:01.498+05:30 [Info] Planner::repairMissingReplica Found SubCluster [127.0.0.1:9003] with len 1. Skipping replica repair attempt.
2023-02-16T18:25:01.498+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM []
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Replica Repair - 4 SG, Missing Replicas, Buddy Node Failed over, No replacement
2023-02-16T18:25:01.499+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 100000000 Units 1000
2023-02-16T18:25:01.499+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 100000000 Units 1000
2023-02-16T18:25:01.499+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 100000000 Units 1000
2023-02-16T18:25:01.499+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9007 SG sg1 Memory 200000000 Units 100
2023-02-16T18:25:01.499+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9008 SG sg2 Memory 200000000 Units 100
2023-02-16T18:25:01.499+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9007 127.0.0.1:9008]]
2023-02-16T18:25:01.499+05:30 [Info] Planner::repairMissingReplica Found SubCluster [127.0.0.1:9001] with len 1. Skipping replica repair attempt.
2023-02-16T18:25:01.499+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM []
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Replica Repair - 4 SG, Missing Replicas, one replica missing with pendingDelete true 
2023-02-16T18:25:01.500+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 100000000 Units 1000
2023-02-16T18:25:01.500+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 100000000 Units 1000
2023-02-16T18:25:01.500+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 100000000 Units 1000
2023-02-16T18:25:01.500+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 100000000 Units 1000
2023-02-16T18:25:01.500+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9007 SG sg1 Memory 200000000 Units 100
2023-02-16T18:25:01.500+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9008 SG sg2 Memory 200000000 Units 100
2023-02-16T18:25:01.500+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9007 127.0.0.1:9008] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.500+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket2,,,idx2,1) on 127.0.0.1:9004
2023-02-16T18:25:01.500+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket4,,,idx2,0) on 127.0.0.1:9001
2023-02-16T18:25:01.501+05:30 [Info] Planner::findMissingReplicaForIndexerNode Skipping Replica Repair for 88883:81813:0. PendingDelete true
2023-02-16T18:25:01.501+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket8,,,idx2,1) on 127.0.0.1:9005
2023-02-16T18:25:01.501+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM []
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Replica Repair - 2 SG, Missing Replicas with Nodes over HWM
2023-02-16T18:25:01.502+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 900000000 Units 7000
2023-02-16T18:25:01.502+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 100000000 Units 500
2023-02-16T18:25:01.502+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 900000000 Units 7000
2023-02-16T18:25:01.502+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005]]
2023-02-16T18:25:01.502+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket9,,,idx2,1) on 127.0.0.1:9005
2023-02-16T18:25:01.502+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM [[127.0.0.1:9001 127.0.0.1:9004]]
2023-02-16T18:25:01.502+05:30 [Info] Planner::executeTenantAwareRebal TenantsToBeMoved from source [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.502+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket3 MemoryUsage 80000000 UnitsUsage 800 
2023-02-16T18:25:01.502+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 70000000 UnitsUsage 700 
2023-02-16T18:25:01.502+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket4 MemoryUsage 120000000 UnitsUsage 500 
2023-02-16T18:25:01.502+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket1 MemoryUsage 30000000 UnitsUsage 1000 
2023-02-16T18:25:01.502+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters Below LWM [[127.0.0.1:9002 127.0.0.1:9005]]
2023-02-16T18:25:01.502+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket3 MemoryUsage 80000000 UnitsUsage 800  can be placed on [127.0.0.1:9002 127.0.0.1:9005]
2023-02-16T18:25:01.502+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 70000000 UnitsUsage 700  can be placed on [127.0.0.1:9002 127.0.0.1:9005]
2023-02-16T18:25:01.502+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket4 MemoryUsage 120000000 UnitsUsage 500  can be placed on [127.0.0.1:9002 127.0.0.1:9005]
2023-02-16T18:25:01.502+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket1 MemoryUsage 30000000 UnitsUsage 1000  can be placed on [127.0.0.1:9002 127.0.0.1:9005]
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Swap Rebalance - 4 SG, Swap 1 node each from 2 SG with 2 new nodes
2023-02-16T18:25:01.503+05:30 [Info] Planner::findPlacementForDeletedNodes Deleted Nodes [127.0.0.1:9002 127.0.0.1:9004]
2023-02-16T18:25:01.503+05:30 [Info] Planner::moveTenantsFromDeletedNodes nonEmptyDeletedNodes [127.0.0.1:9002 127.0.0.1:9004]
2023-02-16T18:25:01.503+05:30 [Info] Planner::moveTenantsFromDeletedNodes pairForDeletedNodes [127.0.0.1:9005 127.0.0.1:9001]
2023-02-16T18:25:01.503+05:30 [Info] Planner::moveTenantsFromDeletedNodes newNodes [127.0.0.1:9003 127.0.0.1:9006 127.0.0.1:9007 127.0.0.1:9008]
2023-02-16T18:25:01.503+05:30 [Info] Planner::moveTenantsFromDeletedNodes selected newNodes for swap [127.0.0.1:9003 127.0.0.1:9008 127.0.0.1:9006 127.0.0.1:9007]
2023-02-16T18:25:01.503+05:30 [Info] Moving index 7777:7171:0 from source 127.0.0.1:9002 to dest 127.0.0.1:9003
2023-02-16T18:25:01.503+05:30 [Info] Moving index 8888:8181:0 from source 127.0.0.1:9002 to dest 127.0.0.1:9003
2023-02-16T18:25:01.503+05:30 [Info] Moving index 9999:9191:0 from source 127.0.0.1:9002 to dest 127.0.0.1:9003
2023-02-16T18:25:01.503+05:30 [Info] Moving index 101010:101101:0 from source 127.0.0.1:9002 to dest 127.0.0.1:9003
2023-02-16T18:25:01.503+05:30 [Info] Moving index 111111:112112:0 from source 127.0.0.1:9002 to dest 127.0.0.1:9003
2023-02-16T18:25:01.503+05:30 [Info] Moving index 121212:121121:0 from source 127.0.0.1:9002 to dest 127.0.0.1:9003
2023-02-16T18:25:01.503+05:30 [Info] Moving index 1111:11212:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9008
2023-02-16T18:25:01.503+05:30 [Info] Moving index 2222:22121:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9008
2023-02-16T18:25:01.503+05:30 [Info] Moving index 3333:33131:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9008
2023-02-16T18:25:01.503+05:30 [Info] Moving index 4444:44141:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9008
2023-02-16T18:25:01.503+05:30 [Info] Moving index 5555:55151:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9008
2023-02-16T18:25:01.503+05:30 [Info] Moving index 6666:66161:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9008
2023-02-16T18:25:01.503+05:30 [Info] Planner::findPlacementForDeletedNodes Remove Deleted Node from solution 127.0.0.1:9002 SG sg2 Memory 0 Units 0
2023-02-16T18:25:01.503+05:30 [Info] Planner::findPlacementForDeletedNodes Remove Deleted Node from solution 127.0.0.1:9004 SG sg2 Memory 0 Units 0
2023-02-16T18:25:01.503+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 900000000 Units 8000
2023-02-16T18:25:01.504+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9003 SG sg1 Memory 1000000000 Units 7000
2023-02-16T18:25:01.504+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 1000000000 Units 7000
2023-02-16T18:25:01.504+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9008 SG sg3 Memory 1100000000 Units 8100
2023-02-16T18:25:01.504+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9008] [127.0.0.1:9003 127.0.0.1:9005] [127.0.0.1:9006] [127.0.0.1:9007]]
2023-02-16T18:25:01.504+05:30 [Info] Planner::repairMissingReplica Found SubCluster [127.0.0.1:9006] with len 1. Skipping replica repair attempt.
2023-02-16T18:25:01.504+05:30 [Info] Planner::repairMissingReplica Found SubCluster [127.0.0.1:9007] with len 1. Skipping replica repair attempt.
2023-02-16T18:25:01.504+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM []
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Swap Rebalance - 4 SG, Swap 1 node each from 2 SG with 2 new nodes(different SG)
2023-02-16T18:25:01.505+05:30 [Info] Planner::findPlacementForDeletedNodes Deleted Nodes [127.0.0.1:9004 127.0.0.1:9005]
2023-02-16T18:25:01.505+05:30 [Info] Planner::moveTenantsFromDeletedNodes nonEmptyDeletedNodes [127.0.0.1:9004 127.0.0.1:9005]
2023-02-16T18:25:01.505+05:30 [Info] Planner::moveTenantsFromDeletedNodes pairForDeletedNodes [127.0.0.1:9001 127.0.0.1:9002]
2023-02-16T18:25:01.505+05:30 [Info] Planner::moveTenantsFromDeletedNodes newNodes [127.0.0.1:9003 127.0.0.1:9006]
2023-02-16T18:25:01.505+05:30 [Info] Planner::moveTenantsFromDeletedNodes selected newNodes for swap [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.505+05:30 [Info] Moving index 1111:11212:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9006
2023-02-16T18:25:01.505+05:30 [Info] Moving index 2222:22121:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9006
2023-02-16T18:25:01.505+05:30 [Info] Moving index 3333:33131:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9006
2023-02-16T18:25:01.505+05:30 [Info] Moving index 4444:44141:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9006
2023-02-16T18:25:01.505+05:30 [Info] Moving index 5555:55151:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9006
2023-02-16T18:25:01.505+05:30 [Info] Moving index 6666:66161:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9006
2023-02-16T18:25:01.505+05:30 [Info] Moving index 7777:17171:0 from source 127.0.0.1:9005 to dest 127.0.0.1:9003
2023-02-16T18:25:01.505+05:30 [Info] Moving index 8888:18181:0 from source 127.0.0.1:9005 to dest 127.0.0.1:9003
2023-02-16T18:25:01.505+05:30 [Info] Moving index 9999:19191:0 from source 127.0.0.1:9005 to dest 127.0.0.1:9003
2023-02-16T18:25:01.505+05:30 [Info] Moving index 101010:1101101:0 from source 127.0.0.1:9005 to dest 127.0.0.1:9003
2023-02-16T18:25:01.505+05:30 [Info] Moving index 111111:1112112:0 from source 127.0.0.1:9005 to dest 127.0.0.1:9003
2023-02-16T18:25:01.505+05:30 [Info] Moving index 121212:1121121:0 from source 127.0.0.1:9005 to dest 127.0.0.1:9003
2023-02-16T18:25:01.505+05:30 [Info] Planner::findPlacementForDeletedNodes Remove Deleted Node from solution 127.0.0.1:9004 SG sg2 Memory 0 Units 0
2023-02-16T18:25:01.505+05:30 [Info] Planner::findPlacementForDeletedNodes Remove Deleted Node from solution 127.0.0.1:9005 SG sg3 Memory 0 Units 0
2023-02-16T18:25:01.505+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 900000000 Units 8000
2023-02-16T18:25:01.505+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 1000000000 Units 7000
2023-02-16T18:25:01.505+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9003 SG sg1 Memory 1000000000 Units 7000
2023-02-16T18:25:01.505+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9006 SG sg3 Memory 900000000 Units 8000
2023-02-16T18:25:01.505+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9007 SG sg1 Memory 200000000 Units 100
2023-02-16T18:25:01.505+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9008 SG sg2 Memory 200000000 Units 100
2023-02-16T18:25:01.505+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9006] [127.0.0.1:9002 127.0.0.1:9003] [127.0.0.1:9007 127.0.0.1:9008]]
2023-02-16T18:25:01.505+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM []
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Swap Rebalance - 4 SG, Swap 1 SG with 2 new nodes
2023-02-16T18:25:01.506+05:30 [Info] Planner::findPlacementForDeletedNodes Deleted Nodes [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.506+05:30 [Info] Planner::moveTenantsFromDeletedNodes nonEmptyDeletedNodes [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.506+05:30 [Info] Planner::moveTenantsFromDeletedNodes pairForDeletedNodes [127.0.0.1:9004 127.0.0.1:9001]
2023-02-16T18:25:01.506+05:30 [Info] Planner::moveTenantsFromDeletedNodes newNodes [127.0.0.1:9003 127.0.0.1:9006]
2023-02-16T18:25:01.506+05:30 [Info] Planner::moveTenantsFromDeletedNodes selected newNodes for swap [127.0.0.1:9003 127.0.0.1:9006]
2023-02-16T18:25:01.506+05:30 [Info] Moving index 1111:1212:0 from source 127.0.0.1:9001 to dest 127.0.0.1:9003
2023-02-16T18:25:01.506+05:30 [Info] Moving index 2222:2121:0 from source 127.0.0.1:9001 to dest 127.0.0.1:9003
2023-02-16T18:25:01.506+05:30 [Info] Moving index 3333:3131:0 from source 127.0.0.1:9001 to dest 127.0.0.1:9003
2023-02-16T18:25:01.506+05:30 [Info] Moving index 4444:4141:0 from source 127.0.0.1:9001 to dest 127.0.0.1:9003
2023-02-16T18:25:01.506+05:30 [Info] Moving index 5555:5151:0 from source 127.0.0.1:9001 to dest 127.0.0.1:9003
2023-02-16T18:25:01.506+05:30 [Info] Moving index 6666:6161:0 from source 127.0.0.1:9001 to dest 127.0.0.1:9003
2023-02-16T18:25:01.506+05:30 [Info] Moving index 1111:11212:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9006
2023-02-16T18:25:01.506+05:30 [Info] Moving index 2222:22121:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9006
2023-02-16T18:25:01.506+05:30 [Info] Moving index 3333:33131:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9006
2023-02-16T18:25:01.506+05:30 [Info] Moving index 4444:44141:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9006
2023-02-16T18:25:01.506+05:30 [Info] Moving index 5555:55151:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9006
2023-02-16T18:25:01.506+05:30 [Info] Moving index 6666:66161:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9006
2023-02-16T18:25:01.506+05:30 [Info] Planner::findPlacementForDeletedNodes Remove Deleted Node from solution 127.0.0.1:9001 SG sg1 Memory 0 Units 0
2023-02-16T18:25:01.506+05:30 [Info] Planner::findPlacementForDeletedNodes Remove Deleted Node from solution 127.0.0.1:9004 SG sg2 Memory 0 Units 0
2023-02-16T18:25:01.506+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 1000000000 Units 7000
2023-02-16T18:25:01.506+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9003 SG sg2 Memory 900000000 Units 8000
2023-02-16T18:25:01.506+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 1000000000 Units 7000
2023-02-16T18:25:01.506+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9006 SG sg3 Memory 900000000 Units 8000
2023-02-16T18:25:01.506+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9007 SG sg1 Memory 200000000 Units 100
2023-02-16T18:25:01.506+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9008 SG sg2 Memory 200000000 Units 100
2023-02-16T18:25:01.506+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9003 127.0.0.1:9006] [127.0.0.1:9007 127.0.0.1:9008]]
2023-02-16T18:25:01.506+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM []
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Swap Rebalance - 4 SG, Swap 1 node with 2 new nodes
2023-02-16T18:25:01.507+05:30 [Info] Planner::findPlacementForDeletedNodes Deleted Nodes [127.0.0.1:9004]
2023-02-16T18:25:01.508+05:30 [Info] Planner::moveTenantsFromDeletedNodes nonEmptyDeletedNodes [127.0.0.1:9004]
2023-02-16T18:25:01.508+05:30 [Info] Planner::moveTenantsFromDeletedNodes pairForDeletedNodes [127.0.0.1:9001]
2023-02-16T18:25:01.508+05:30 [Info] Planner::moveTenantsFromDeletedNodes newNodes [127.0.0.1:9003 127.0.0.1:9007 127.0.0.1:9008 127.0.0.1:9006]
2023-02-16T18:25:01.508+05:30 [Info] Planner::moveTenantsFromDeletedNodes selected newNodes for swap [127.0.0.1:9006 127.0.0.1:9003 127.0.0.1:9007 127.0.0.1:9008]
2023-02-16T18:25:01.508+05:30 [Info] Moving index 1111:11212:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9006
2023-02-16T18:25:01.508+05:30 [Info] Moving index 2222:22121:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9006
2023-02-16T18:25:01.508+05:30 [Info] Moving index 3333:33131:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9006
2023-02-16T18:25:01.508+05:30 [Info] Moving index 4444:44141:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9006
2023-02-16T18:25:01.508+05:30 [Info] Moving index 5555:55151:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9006
2023-02-16T18:25:01.508+05:30 [Info] Moving index 6666:66161:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9006
2023-02-16T18:25:01.508+05:30 [Info] Planner::findPlacementForDeletedNodes Remove Deleted Node from solution 127.0.0.1:9004 SG sg2 Memory 0 Units 0
2023-02-16T18:25:01.508+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 900000000 Units 8000
2023-02-16T18:25:01.508+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 1000000000 Units 7000
2023-02-16T18:25:01.508+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 1000000000 Units 7000
2023-02-16T18:25:01.508+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9006 SG sg3 Memory 900000000 Units 8000
2023-02-16T18:25:01.508+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9006] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9003] [127.0.0.1:9008] [127.0.0.1:9007]]
2023-02-16T18:25:01.508+05:30 [Info] Planner::repairMissingReplica Found SubCluster [127.0.0.1:9003] with len 1. Skipping replica repair attempt.
2023-02-16T18:25:01.508+05:30 [Info] Planner::repairMissingReplica Found SubCluster [127.0.0.1:9008] with len 1. Skipping replica repair attempt.
2023-02-16T18:25:01.508+05:30 [Info] Planner::repairMissingReplica Found SubCluster [127.0.0.1:9007] with len 1. Skipping replica repair attempt.
2023-02-16T18:25:01.508+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM []
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Swap Rebalance - 4 SG, Swap 1 empty node with 2 new nodes
2023-02-16T18:25:01.509+05:30 [Info] Planner::findPlacementForDeletedNodes Deleted Nodes [127.0.0.1:9004]
2023-02-16T18:25:01.509+05:30 [Info] Planner::moveTenantsFromDeletedNodes No non-empty deleted nodes found.
2023-02-16T18:25:01.509+05:30 [Info] Planner::findPlacementForDeletedNodes Remove Deleted Node from solution 127.0.0.1:9004 SG sg2 Memory 900000000 Units 8000
2023-02-16T18:25:01.509+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 900000000 Units 8000
2023-02-16T18:25:01.509+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 1000000000 Units 7000
2023-02-16T18:25:01.509+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 1000000000 Units 7000
2023-02-16T18:25:01.509+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9007 SG sg1 Memory 200000000 Units 100
2023-02-16T18:25:01.509+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9008 SG sg2 Memory 200000000 Units 100
2023-02-16T18:25:01.509+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9006] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9007 127.0.0.1:9008] [127.0.0.1:9003]]
2023-02-16T18:25:01.509+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket1,,,idx1,1) on 127.0.0.1:9006
2023-02-16T18:25:01.509+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket2,,,idx2,1) on 127.0.0.1:9006
2023-02-16T18:25:01.509+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket3,,,idx1,1) on 127.0.0.1:9006
2023-02-16T18:25:01.509+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket4,,,idx2,1) on 127.0.0.1:9006
2023-02-16T18:25:01.509+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket5,,,idx1,1) on 127.0.0.1:9006
2023-02-16T18:25:01.509+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket6,,,idx2,1) on 127.0.0.1:9006
2023-02-16T18:25:01.509+05:30 [Info] Planner::repairMissingReplica Found SubCluster [127.0.0.1:9003] with len 1. Skipping replica repair attempt.
2023-02-16T18:25:01.509+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM []
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Swap Rebalance - 1 SG, Swap 1 node - Failed swap rebalance
2023-02-16T18:25:01.510+05:30 [Info] Planner::findPlacementForDeletedNodes Deleted Nodes [127.0.0.1:9004]
2023-02-16T18:25:01.510+05:30 [Info] Planner::moveTenantsFromDeletedNodes nonEmptyDeletedNodes [127.0.0.1:9004]
2023-02-16T18:25:01.510+05:30 [Info] Planner::moveTenantsFromDeletedNodes pairForDeletedNodes [127.0.0.1:9001]
2023-02-16T18:25:01.510+05:30 [Info] Planner::moveTenantsFromDeletedNodes newNodes []
2023-02-16T18:25:01.510+05:30 [Info] Planner::moveTenantsFromDeletedNodes Num deleted nodes 1 is more than num new/empty nodes 0
2023-02-16T18:25:01.510+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 900000000 Units 8000
2023-02-16T18:25:01.510+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Skip Deleted Index Node 127.0.0.1:9004 SG sg2 Memory 900000000 Units 8000
2023-02-16T18:25:01.510+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9008 SG sg3 Memory 200000000 Units 100
2023-02-16T18:25:01.511+05:30 [Info] Planner::moveTenantsFromDeletedNodes Considering 127.0.0.1:9008 as replacement node for deleted node 127.0.0.1:9004.
2023-02-16T18:25:01.511+05:30 [Info] Moving index 1111:11212:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9008
2023-02-16T18:25:01.511+05:30 [Info] Moving index 2222:22121:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9008
2023-02-16T18:25:01.511+05:30 [Info] Moving index 3333:33131:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9008
2023-02-16T18:25:01.511+05:30 [Info] Moving index 4444:44141:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9008
2023-02-16T18:25:01.511+05:30 [Info] Planner::findPlacementForDeletedNodes Remove Deleted Node from solution 127.0.0.1:9004 SG sg2 Memory 600000000 Units 4000
2023-02-16T18:25:01.511+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 900000000 Units 8000
2023-02-16T18:25:01.511+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9008 SG sg3 Memory 500000000 Units 4100
2023-02-16T18:25:01.511+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9008]]
2023-02-16T18:25:01.511+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM []
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Swap Rebalance - 1 SG, Swap 2 node - server group mismatch
2023-02-16T18:25:01.511+05:30 [Info] Planner::findPlacementForDeletedNodes Deleted Nodes [127.0.0.1:9001 127.0.0.1:9002]
2023-02-16T18:25:01.511+05:30 [Info] Planner::moveTenantsFromDeletedNodes nonEmptyDeletedNodes [127.0.0.1:9001 127.0.0.1:9002]
2023-02-16T18:25:01.511+05:30 [Info] Planner::moveTenantsFromDeletedNodes pairForDeletedNodes [127.0.0.1:9002 127.0.0.1:9001]
2023-02-16T18:25:01.511+05:30 [Info] Planner::moveTenantsFromDeletedNodes newNodes [127.0.0.1:9003 127.0.0.1:9004]
2023/02/16 18:25:01 Expected error Planner - Unable to satisfy server group constraint while replacing removed nodes with new nodes.
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 4 SG, Move out 1 subcluster, Enough Capacity
2023-02-16T18:25:01.514+05:30 [Info] Planner::findPlacementForDeletedNodes Deleted Nodes [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.514+05:30 [Info] Planner::moveTenantsFromDeletedNodes nonEmptyDeletedNodes [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.514+05:30 [Info] Planner::moveTenantsFromDeletedNodes pairForDeletedNodes [127.0.0.1:9004 127.0.0.1:9001]
2023-02-16T18:25:01.514+05:30 [Info] Planner::moveTenantsFromDeletedNodes newNodes []
2023-02-16T18:25:01.514+05:30 [Info] Planner::moveTenantsFromDeletedNodes Num deleted nodes 2 is more than num new/empty nodes 0
2023-02-16T18:25:01.514+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantsToBeMoved from source [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.514+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket5 MemoryUsage 350000000 UnitsUsage 2000 
2023-02-16T18:25:01.514+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket6 MemoryUsage 250000000 UnitsUsage 2000 
2023-02-16T18:25:01.514+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket4 MemoryUsage 120000000 UnitsUsage 1500 
2023-02-16T18:25:01.514+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket3 MemoryUsage 80000000 UnitsUsage 1200 
2023-02-16T18:25:01.514+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 70000000 UnitsUsage 800 
2023-02-16T18:25:01.514+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket1 MemoryUsage 30000000 UnitsUsage 500 
2023-02-16T18:25:01.514+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket17 MemoryUsage 0 UnitsUsage 0 
2023-02-16T18:25:01.514+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket14 MemoryUsage 0 UnitsUsage 0 
2023-02-16T18:25:01.514+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Skip Deleted Index Node 127.0.0.1:9001 SG sg1 Memory 900000000 Units 8000
2023-02-16T18:25:01.514+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 1000000000 Units 7000
2023-02-16T18:25:01.514+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9003 SG sg1 Memory 100000000 Units 500
2023-02-16T18:25:01.514+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Skip Deleted Index Node 127.0.0.1:9004 SG sg2 Memory 900000000 Units 8000
2023-02-16T18:25:01.514+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 1000000000 Units 7000
2023-02-16T18:25:01.514+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9006 SG sg3 Memory 100000000 Units 500
2023-02-16T18:25:01.514+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9007 SG sg1 Memory 200000000 Units 100
2023-02-16T18:25:01.514+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9008 SG sg2 Memory 200000000 Units 100
2023-02-16T18:25:01.514+05:30 [Info] Planner::moveTenantsFromDeletedNodes Found SubClusters Below LWM [[127.0.0.1:9003 127.0.0.1:9006] [127.0.0.1:9007 127.0.0.1:9008]]
2023-02-16T18:25:01.514+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket5 MemoryUsage 350000000 UnitsUsage 2000  can be placed on [127.0.0.1:9003 127.0.0.1:9006]
2023-02-16T18:25:01.514+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket6 MemoryUsage 250000000 UnitsUsage 2000  can be placed on [127.0.0.1:9007 127.0.0.1:9008]
2023-02-16T18:25:01.514+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket4 MemoryUsage 120000000 UnitsUsage 1500  can be placed on [127.0.0.1:9003 127.0.0.1:9006]
2023-02-16T18:25:01.514+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket3 MemoryUsage 80000000 UnitsUsage 1200  can be placed on [127.0.0.1:9007 127.0.0.1:9008]
2023-02-16T18:25:01.514+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 70000000 UnitsUsage 800  can be placed on [127.0.0.1:9007 127.0.0.1:9008]
2023-02-16T18:25:01.514+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket1 MemoryUsage 30000000 UnitsUsage 500  can be placed on [127.0.0.1:9003 127.0.0.1:9006]
2023-02-16T18:25:01.514+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket17 MemoryUsage 0 UnitsUsage 0  can be placed on [127.0.0.1:9003 127.0.0.1:9006]
2023-02-16T18:25:01.514+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket14 MemoryUsage 0 UnitsUsage 0  can be placed on [127.0.0.1:9003 127.0.0.1:9006]
2023-02-16T18:25:01.514+05:30 [Info] Planner::findPlacementForDeletedNodes Remove Deleted Node from solution 127.0.0.1:9001 SG sg1 Memory 0 Units 0
2023-02-16T18:25:01.514+05:30 [Info] Planner::findPlacementForDeletedNodes Remove Deleted Node from solution 127.0.0.1:9004 SG sg2 Memory 0 Units 0
2023-02-16T18:25:01.514+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 1000000000 Units 7000
2023-02-16T18:25:01.514+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9003 SG sg1 Memory 600000000 Units 4500
2023-02-16T18:25:01.515+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 1000000000 Units 7000
2023-02-16T18:25:01.515+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9006 SG sg3 Memory 600000000 Units 4500
2023-02-16T18:25:01.515+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9007 SG sg1 Memory 600000000 Units 4100
2023-02-16T18:25:01.515+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9008 SG sg2 Memory 600000000 Units 4100
2023-02-16T18:25:01.515+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9003 127.0.0.1:9006] [127.0.0.1:9007 127.0.0.1:9008]]
2023-02-16T18:25:01.515+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM [[127.0.0.1:9002 127.0.0.1:9005]]
2023-02-16T18:25:01.515+05:30 [Info] Planner::executeTenantAwareRebal TenantsToBeMoved from source [127.0.0.1:9002 127.0.0.1:9005]
2023-02-16T18:25:01.515+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket10 MemoryUsage 200000000 UnitsUsage 800 
2023-02-16T18:25:01.515+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket7 MemoryUsage 25000000 UnitsUsage 500 
2023-02-16T18:25:01.515+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket9 MemoryUsage 100000000 UnitsUsage 700 
2023-02-16T18:25:01.515+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket15 MemoryUsage 0 UnitsUsage 0 
2023-02-16T18:25:01.515+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket8 MemoryUsage 75000000 UnitsUsage 1000 
2023-02-16T18:25:01.515+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket16 MemoryUsage 0 UnitsUsage 0 
2023-02-16T18:25:01.515+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters Below LWM []
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 4 SG, Move out 1 subcluster, Not Enough Capacity
2023-02-16T18:25:01.516+05:30 [Info] Planner::findPlacementForDeletedNodes Deleted Nodes [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.516+05:30 [Info] Planner::moveTenantsFromDeletedNodes nonEmptyDeletedNodes [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.516+05:30 [Info] Planner::moveTenantsFromDeletedNodes pairForDeletedNodes [127.0.0.1:9004 127.0.0.1:9001]
2023-02-16T18:25:01.516+05:30 [Info] Planner::moveTenantsFromDeletedNodes newNodes []
2023-02-16T18:25:01.516+05:30 [Info] Planner::moveTenantsFromDeletedNodes Num deleted nodes 2 is more than num new/empty nodes 0
2023-02-16T18:25:01.516+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantsToBeMoved from source [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.516+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket6 MemoryUsage 300000000 UnitsUsage 2000 
2023-02-16T18:25:01.516+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket5 MemoryUsage 300000000 UnitsUsage 2000 
2023-02-16T18:25:01.516+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket4 MemoryUsage 120000000 UnitsUsage 1500 
2023-02-16T18:25:01.516+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket3 MemoryUsage 80000000 UnitsUsage 1200 
2023-02-16T18:25:01.516+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 70000000 UnitsUsage 800 
2023-02-16T18:25:01.516+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket1 MemoryUsage 30000000 UnitsUsage 500 
2023-02-16T18:25:01.516+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket17 MemoryUsage 0 UnitsUsage 0 
2023-02-16T18:25:01.516+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket14 MemoryUsage 0 UnitsUsage 0 
2023-02-16T18:25:01.516+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Skip Deleted Index Node 127.0.0.1:9001 SG sg1 Memory 900000000 Units 8000
2023-02-16T18:25:01.516+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 1000000000 Units 7000
2023-02-16T18:25:01.516+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9003 SG sg1 Memory 100000000 Units 500
2023-02-16T18:25:01.516+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Skip Deleted Index Node 127.0.0.1:9004 SG sg2 Memory 900000000 Units 8000
2023-02-16T18:25:01.516+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 1000000000 Units 7000
2023-02-16T18:25:01.516+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9006 SG sg3 Memory 100000000 Units 500
2023-02-16T18:25:01.516+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9007 SG sg1 Memory 200000000 Units 100
2023-02-16T18:25:01.516+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9008 SG sg2 Memory 200000000 Units 100
2023-02-16T18:25:01.516+05:30 [Info] Planner::moveTenantsFromDeletedNodes Found SubClusters Below LWM [[127.0.0.1:9003 127.0.0.1:9006] [127.0.0.1:9007 127.0.0.1:9008]]
2023-02-16T18:25:01.516+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket6 MemoryUsage 300000000 UnitsUsage 2000  can be placed on [127.0.0.1:9003 127.0.0.1:9006]
2023-02-16T18:25:01.516+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket5 MemoryUsage 300000000 UnitsUsage 2000  can be placed on [127.0.0.1:9007 127.0.0.1:9008]
2023-02-16T18:25:01.516+05:30 [Info] Planner::moveTenantsToLowUsageSubCluster Unable to place TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket4 MemoryUsage 120000000 UnitsUsage 1500  on any target
2023-02-16T18:25:01.516+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket3 MemoryUsage 80000000 UnitsUsage 1200  can be placed on [127.0.0.1:9003 127.0.0.1:9006]
2023-02-16T18:25:01.516+05:30 [Info] Planner::moveTenantsToLowUsageSubCluster Unable to place TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 70000000 UnitsUsage 800  on any target
2023-02-16T18:25:01.516+05:30 [Info] Planner::moveTenantsToLowUsageSubCluster Unable to place TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket1 MemoryUsage 30000000 UnitsUsage 500  on any target
2023-02-16T18:25:01.516+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket17 MemoryUsage 0 UnitsUsage 0  can be placed on [127.0.0.1:9003 127.0.0.1:9006]
2023-02-16T18:25:01.516+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket14 MemoryUsage 0 UnitsUsage 0  can be placed on [127.0.0.1:9003 127.0.0.1:9006]
2023-02-16T18:25:01.516+05:30 [Error] Planner - Not enough capacity to place indexes of deleted nodes.
2023/02/16 18:25:01 Expected error Planner - Not enough capacity to place indexes of deleted nodes.
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 4 SG, Move out 1 node, Pair node not deleted(Failed swap rebalance of 1 node)
2023-02-16T18:25:01.518+05:30 [Info] Planner::findPlacementForDeletedNodes Deleted Nodes [127.0.0.1:9001]
2023-02-16T18:25:01.518+05:30 [Info] Planner::moveTenantsFromDeletedNodes nonEmptyDeletedNodes [127.0.0.1:9001]
2023-02-16T18:25:01.518+05:30 [Info] Planner::moveTenantsFromDeletedNodes pairForDeletedNodes [127.0.0.1:9004]
2023-02-16T18:25:01.518+05:30 [Info] Planner::moveTenantsFromDeletedNodes newNodes []
2023-02-16T18:25:01.518+05:30 [Info] Planner::moveTenantsFromDeletedNodes Num deleted nodes 1 is more than num new/empty nodes 0
2023-02-16T18:25:01.518+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Skip Deleted Index Node 127.0.0.1:9001 SG sg1 Memory 900000000 Units 8000
2023-02-16T18:25:01.518+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 1000000000 Units 7000
2023-02-16T18:25:01.518+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9003 SG sg1 Memory 100000000 Units 500
2023-02-16T18:25:01.518+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 900000000 Units 8000
2023-02-16T18:25:01.518+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 1000000000 Units 7000
2023-02-16T18:25:01.518+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9006 SG sg3 Memory 100000000 Units 500
2023-02-16T18:25:01.518+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9007 SG sg1 Memory 200000000 Units 100
2023-02-16T18:25:01.518+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9008 SG sg2 Memory 200000000 Units 100
2023-02-16T18:25:01.518+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9009 SG sg1 Memory 200000000 Units 100
2023-02-16T18:25:01.518+05:30 [Info] Planner::moveTenantsFromDeletedNodes Considering 127.0.0.1:9009 as replacement node for deleted node 127.0.0.1:9001.
2023-02-16T18:25:01.518+05:30 [Info] Moving index 1111:1212:0 from source 127.0.0.1:9001 to dest 127.0.0.1:9009
2023-02-16T18:25:01.518+05:30 [Info] Moving index 2222:2121:0 from source 127.0.0.1:9001 to dest 127.0.0.1:9009
2023-02-16T18:25:01.518+05:30 [Info] Moving index 3333:3131:0 from source 127.0.0.1:9001 to dest 127.0.0.1:9009
2023-02-16T18:25:01.518+05:30 [Info] Moving index 4444:4141:0 from source 127.0.0.1:9001 to dest 127.0.0.1:9009
2023-02-16T18:25:01.518+05:30 [Info] Planner::findPlacementForDeletedNodes Remove Deleted Node from solution 127.0.0.1:9001 SG sg1 Memory 600000000 Units 4000
2023-02-16T18:25:01.518+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 1000000000 Units 7000
2023-02-16T18:25:01.518+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9003 SG sg1 Memory 100000000 Units 500
2023-02-16T18:25:01.518+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 900000000 Units 8000
2023-02-16T18:25:01.518+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 1000000000 Units 7000
2023-02-16T18:25:01.518+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9006 SG sg3 Memory 100000000 Units 500
2023-02-16T18:25:01.518+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9007 SG sg1 Memory 200000000 Units 100
2023-02-16T18:25:01.518+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9008 SG sg2 Memory 200000000 Units 100
2023-02-16T18:25:01.518+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9009 SG sg1 Memory 500000000 Units 4100
2023-02-16T18:25:01.518+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9003 127.0.0.1:9006] [127.0.0.1:9004 127.0.0.1:9009] [127.0.0.1:9007 127.0.0.1:9008]]
2023-02-16T18:25:01.518+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM [[127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9004 127.0.0.1:9009]]
2023-02-16T18:25:01.518+05:30 [Info] Planner::executeTenantAwareRebal TenantsToBeMoved from source [127.0.0.1:9002 127.0.0.1:9005]
2023-02-16T18:25:01.518+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket10 MemoryUsage 200000000 UnitsUsage 800 
2023-02-16T18:25:01.518+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket7 MemoryUsage 25000000 UnitsUsage 500 
2023-02-16T18:25:01.518+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket9 MemoryUsage 100000000 UnitsUsage 700 
2023-02-16T18:25:01.518+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket15 MemoryUsage 0 UnitsUsage 0 
2023-02-16T18:25:01.518+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket8 MemoryUsage 75000000 UnitsUsage 1000 
2023-02-16T18:25:01.518+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket16 MemoryUsage 0 UnitsUsage 0 
2023-02-16T18:25:01.518+05:30 [Info] Planner::executeTenantAwareRebal TenantsToBeMoved from source [127.0.0.1:9004 127.0.0.1:9009]
2023-02-16T18:25:01.518+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9004 TenantId bucket4 MemoryUsage 120000000 UnitsUsage 1500 
2023-02-16T18:25:01.518+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9004 TenantId bucket2 MemoryUsage 70000000 UnitsUsage 800 
2023-02-16T18:25:01.518+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9004 TenantId bucket3 MemoryUsage 80000000 UnitsUsage 1200 
2023-02-16T18:25:01.518+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9004 TenantId bucket1 MemoryUsage 30000000 UnitsUsage 500 
2023-02-16T18:25:01.518+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9004 TenantId bucket17 MemoryUsage 0 UnitsUsage 0 
2023-02-16T18:25:01.518+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9004 TenantId bucket14 MemoryUsage 0 UnitsUsage 0 
2023-02-16T18:25:01.518+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters Below LWM [[127.0.0.1:9003 127.0.0.1:9006] [127.0.0.1:9007 127.0.0.1:9008]]
2023-02-16T18:25:01.518+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket10 MemoryUsage 200000000 UnitsUsage 800  can be placed on [127.0.0.1:9003 127.0.0.1:9006]
2023-02-16T18:25:01.518+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9004 TenantId bucket4 MemoryUsage 120000000 UnitsUsage 1500  can be placed on [127.0.0.1:9003 127.0.0.1:9006]
2023-02-16T18:25:01.518+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket7 MemoryUsage 25000000 UnitsUsage 500  can be placed on [127.0.0.1:9003 127.0.0.1:9006]
2023-02-16T18:25:01.518+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9004 TenantId bucket2 MemoryUsage 70000000 UnitsUsage 800  can be placed on [127.0.0.1:9003 127.0.0.1:9006]
2023-02-16T18:25:01.518+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket9 MemoryUsage 100000000 UnitsUsage 700  can be placed on [127.0.0.1:9007 127.0.0.1:9008]
2023-02-16T18:25:01.518+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9004 TenantId bucket3 MemoryUsage 80000000 UnitsUsage 1200  can be placed on [127.0.0.1:9003 127.0.0.1:9006]
2023-02-16T18:25:01.518+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket15 MemoryUsage 0 UnitsUsage 0  can be placed on [127.0.0.1:9003 127.0.0.1:9006]
2023-02-16T18:25:01.518+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9004 TenantId bucket1 MemoryUsage 30000000 UnitsUsage 500  can be placed on [127.0.0.1:9007 127.0.0.1:9008]
2023-02-16T18:25:01.518+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket8 MemoryUsage 75000000 UnitsUsage 1000  can be placed on [127.0.0.1:9007 127.0.0.1:9008]
2023-02-16T18:25:01.518+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9004 TenantId bucket17 MemoryUsage 0 UnitsUsage 0  can be placed on [127.0.0.1:9003 127.0.0.1:9006]
2023-02-16T18:25:01.519+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket16 MemoryUsage 0 UnitsUsage 0  can be placed on [127.0.0.1:9003 127.0.0.1:9006]
2023-02-16T18:25:01.519+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9004 TenantId bucket14 MemoryUsage 0 UnitsUsage 0  can be placed on [127.0.0.1:9003 127.0.0.1:9006]
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 4 SG, Move out 1 node, Pair node already deleted
2023-02-16T18:25:01.519+05:30 [Info] Planner::findPlacementForDeletedNodes Deleted Nodes [127.0.0.1:9002]
2023-02-16T18:25:01.519+05:30 [Info] Planner::moveTenantsFromDeletedNodes nonEmptyDeletedNodes [127.0.0.1:9002]
2023-02-16T18:25:01.519+05:30 [Info] Planner::moveTenantsFromDeletedNodes pairForDeletedNodes []
2023-02-16T18:25:01.519+05:30 [Info] Planner::moveTenantsFromDeletedNodes newNodes []
2023-02-16T18:25:01.519+05:30 [Info] Planner::moveTenantsFromDeletedNodes Num deleted nodes 1 is more than num new/empty nodes 0
2023-02-16T18:25:01.519+05:30 [Info] Planner::moveTenantsFromDeletedNodes Pair node not found for deleted node 127.0.0.1:9002.
2023-02-16T18:25:01.519+05:30 [Error] Planner - Pair node for 127.0.0.1:9002 not found. Provide additional node as replacement.
2023/02/16 18:25:01 Expected error Planner - Pair node for 127.0.0.1:9002 not found. Provide additional node as replacement.
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 4 SG, Move out 1 subcluster, empty nodes
2023-02-16T18:25:01.521+05:30 [Info] Planner::findPlacementForDeletedNodes Deleted Nodes [127.0.0.1:9005 127.0.0.1:9007 127.0.0.1:9008]
2023-02-16T18:25:01.521+05:30 [Info] Planner::moveTenantsFromDeletedNodes No non-empty deleted nodes found.
2023-02-16T18:25:01.521+05:30 [Info] Planner::findPlacementForDeletedNodes Remove Deleted Node from solution 127.0.0.1:9005 SG sg3 Memory 0 Units 0
2023-02-16T18:25:01.521+05:30 [Info] Planner::findPlacementForDeletedNodes Remove Deleted Node from solution 127.0.0.1:9007 SG sg1 Memory 200000000 Units 100
2023-02-16T18:25:01.521+05:30 [Info] Planner::findPlacementForDeletedNodes Remove Deleted Node from solution 127.0.0.1:9008 SG sg2 Memory 200000000 Units 100
2023-02-16T18:25:01.521+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 800000000 Units 8000
2023-02-16T18:25:01.521+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 800000000 Units 7000
2023-02-16T18:25:01.521+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9003 SG sg1 Memory 100000000 Units 500
2023-02-16T18:25:01.521+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 800000000 Units 8000
2023-02-16T18:25:01.521+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9006 SG sg3 Memory 100000000 Units 500
2023-02-16T18:25:01.521+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002] [127.0.0.1:9003 127.0.0.1:9006]]
2023-02-16T18:25:01.521+05:30 [Info] Planner::repairMissingReplica Found SubCluster [127.0.0.1:9002] with len 1. Skipping replica repair attempt.
2023-02-16T18:25:01.521+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM []
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 4 SG, Move out 1 subcluster, Deleted Nodes more than added nodes
2023-02-16T18:25:01.521+05:30 [Info] Planner::findPlacementForDeletedNodes Deleted Nodes [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.521+05:30 [Info] Planner::moveTenantsFromDeletedNodes nonEmptyDeletedNodes [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.521+05:30 [Info] Planner::moveTenantsFromDeletedNodes pairForDeletedNodes [127.0.0.1:9004 127.0.0.1:9001]
2023-02-16T18:25:01.521+05:30 [Info] Planner::moveTenantsFromDeletedNodes newNodes [127.0.0.1:9008]
2023-02-16T18:25:01.521+05:30 [Info] Planner::moveTenantsFromDeletedNodes Num deleted nodes 2 is more than num new/empty nodes 1
2023-02-16T18:25:01.521+05:30 [Error] Planner - Number of non-empty deleted nodes cannot be greater than number of added nodes.
2023/02/16 18:25:01 Expected error Planner - Number of non-empty deleted nodes cannot be greater than number of added nodes.
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 4 SG, Move out 1 node, Add 1 node in, server group mismatch
2023-02-16T18:25:01.522+05:30 [Info] Planner::findPlacementForDeletedNodes Deleted Nodes [127.0.0.1:9006]
2023-02-16T18:25:01.522+05:30 [Info] Planner::moveTenantsFromDeletedNodes nonEmptyDeletedNodes [127.0.0.1:9006]
2023-02-16T18:25:01.522+05:30 [Info] Planner::moveTenantsFromDeletedNodes pairForDeletedNodes [127.0.0.1:9003]
2023-02-16T18:25:01.522+05:30 [Info] Planner::moveTenantsFromDeletedNodes newNodes [127.0.0.1:9007]
2023/02/16 18:25:01 Expected error Planner - Unable to satisfy server group constraint while replacing removed nodes with new nodes.
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 4 SG, Move out 1 node, Pair node exists
2023-02-16T18:25:01.523+05:30 [Info] Planner::findPlacementForDeletedNodes Deleted Nodes [127.0.0.1:9004]
2023-02-16T18:25:01.523+05:30 [Info] Planner::moveTenantsFromDeletedNodes nonEmptyDeletedNodes [127.0.0.1:9004]
2023-02-16T18:25:01.523+05:30 [Info] Planner::moveTenantsFromDeletedNodes pairForDeletedNodes [127.0.0.1:9001]
2023-02-16T18:25:01.523+05:30 [Info] Planner::moveTenantsFromDeletedNodes newNodes []
2023-02-16T18:25:01.523+05:30 [Info] Planner::moveTenantsFromDeletedNodes Num deleted nodes 1 is more than num new/empty nodes 0
2023-02-16T18:25:01.523+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 900000000 Units 8000
2023-02-16T18:25:01.523+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 1000000000 Units 7000
2023-02-16T18:25:01.523+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9003 SG sg1 Memory 100000000 Units 500
2023-02-16T18:25:01.523+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Skip Deleted Index Node 127.0.0.1:9004 SG sg2 Memory 900000000 Units 8000
2023-02-16T18:25:01.523+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 1000000000 Units 7000
2023-02-16T18:25:01.523+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9006 SG sg3 Memory 100000000 Units 500
2023-02-16T18:25:01.523+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9007 SG sg1 Memory 200000000 Units 100
2023-02-16T18:25:01.523+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9008 SG sg2 Memory 200000000 Units 100
2023-02-16T18:25:01.523+05:30 [Info] Planner::moveTenantsFromDeletedNodes No replacement node found for deleted node 127.0.0.1:9004.
2023-02-16T18:25:01.523+05:30 [Error] Planner - Removing node 127.0.0.1:9004 will result in losing indexes. Provide additional node as replacement.
2023/02/16 18:25:01 Expected error Planner - Removing node 127.0.0.1:9004 will result in losing indexes. Provide additional node as replacement.
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 4 SG, Move out 1 subcluster, No nodes under LWM
2023-02-16T18:25:01.524+05:30 [Info] Planner::findPlacementForDeletedNodes Deleted Nodes [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.524+05:30 [Info] Planner::moveTenantsFromDeletedNodes nonEmptyDeletedNodes [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.524+05:30 [Info] Planner::moveTenantsFromDeletedNodes pairForDeletedNodes [127.0.0.1:9004 127.0.0.1:9001]
2023-02-16T18:25:01.524+05:30 [Info] Planner::moveTenantsFromDeletedNodes newNodes []
2023-02-16T18:25:01.524+05:30 [Info] Planner::moveTenantsFromDeletedNodes Num deleted nodes 2 is more than num new/empty nodes 0
2023-02-16T18:25:01.524+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantsToBeMoved from source [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.524+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket5 MemoryUsage 300000000 UnitsUsage 2000 
2023-02-16T18:25:01.524+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket6 MemoryUsage 300000000 UnitsUsage 2000 
2023-02-16T18:25:01.524+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket4 MemoryUsage 120000000 UnitsUsage 1500 
2023-02-16T18:25:01.524+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket3 MemoryUsage 80000000 UnitsUsage 1200 
2023-02-16T18:25:01.524+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 70000000 UnitsUsage 800 
2023-02-16T18:25:01.524+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket1 MemoryUsage 30000000 UnitsUsage 500 
2023-02-16T18:25:01.524+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket14 MemoryUsage 0 UnitsUsage 0 
2023-02-16T18:25:01.524+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket17 MemoryUsage 0 UnitsUsage 0 
2023-02-16T18:25:01.524+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Skip Deleted Index Node 127.0.0.1:9001 SG sg1 Memory 900000000 Units 8000
2023-02-16T18:25:01.524+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 1000000000 Units 7000
2023-02-16T18:25:01.524+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Skip Deleted Index Node 127.0.0.1:9004 SG sg2 Memory 900000000 Units 8000
2023-02-16T18:25:01.524+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 1000000000 Units 7000
2023-02-16T18:25:01.524+05:30 [Info] Planner::moveTenantsFromDeletedNodes Found SubClusters Below LWM []
2023-02-16T18:25:01.524+05:30 [Error] Planner - Not enough capacity to place indexes of deleted nodes.
2023/02/16 18:25:01 Expected error Planner - Not enough capacity to place indexes of deleted nodes.
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 4 SG, Move out 1 subcluster, Not Enough Capacity, Partial Subcluster
2023-02-16T18:25:01.525+05:30 [Info] Planner::findPlacementForDeletedNodes Deleted Nodes [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.525+05:30 [Info] Planner::moveTenantsFromDeletedNodes nonEmptyDeletedNodes [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.525+05:30 [Info] Planner::moveTenantsFromDeletedNodes pairForDeletedNodes [127.0.0.1:9004 127.0.0.1:9001]
2023-02-16T18:25:01.525+05:30 [Info] Planner::moveTenantsFromDeletedNodes newNodes []
2023-02-16T18:25:01.525+05:30 [Info] Planner::moveTenantsFromDeletedNodes Num deleted nodes 2 is more than num new/empty nodes 0
2023-02-16T18:25:01.525+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantsToBeMoved from source [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.525+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket1 MemoryUsage 100000000 UnitsUsage 500 
2023-02-16T18:25:01.525+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 100000000 UnitsUsage 500 
2023-02-16T18:25:01.525+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Skip Deleted Index Node 127.0.0.1:9001 SG sg1 Memory 200000000 Units 1000
2023-02-16T18:25:01.525+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 1000000000 Units 7000
2023-02-16T18:25:01.525+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Skip Deleted Index Node 127.0.0.1:9004 SG sg2 Memory 200000000 Units 1000
2023-02-16T18:25:01.525+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 1000000000 Units 7000
2023-02-16T18:25:01.525+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9006 SG sg3 Memory 100000000 Units 500
2023-02-16T18:25:01.525+05:30 [Info] Planner::filterPartialSubClusters Filter partial subcluster [127.0.0.1:9006]
2023-02-16T18:25:01.525+05:30 [Info] Planner::moveTenantsFromDeletedNodes Found SubClusters Below LWM []
2023-02-16T18:25:01.525+05:30 [Error] Planner - Not enough capacity to place indexes of deleted nodes.
2023/02/16 18:25:01 Expected error Planner - Not enough capacity to place indexes of deleted nodes.
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 2 SG, Move out 1 non-empty and 1 empty node
2023-02-16T18:25:01.526+05:30 [Info] Planner::findPlacementForDeletedNodes Deleted Nodes [127.0.0.1:9007 127.0.0.1:9004]
2023-02-16T18:25:01.526+05:30 [Info] Planner::moveTenantsFromDeletedNodes nonEmptyDeletedNodes [127.0.0.1:9004]
2023-02-16T18:25:01.526+05:30 [Info] Planner::moveTenantsFromDeletedNodes pairForDeletedNodes [127.0.0.1:9001]
2023-02-16T18:25:01.526+05:30 [Info] Planner::moveTenantsFromDeletedNodes newNodes [127.0.0.1:9008]
2023-02-16T18:25:01.526+05:30 [Info] Planner::moveTenantsFromDeletedNodes selected newNodes for swap [127.0.0.1:9008]
2023-02-16T18:25:01.526+05:30 [Info] Moving index 1111:11212:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9008
2023-02-16T18:25:01.526+05:30 [Info] Moving index 2222:22121:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9008
2023-02-16T18:25:01.526+05:30 [Info] Moving index 3333:33131:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9008
2023-02-16T18:25:01.526+05:30 [Info] Moving index 4444:44141:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9008
2023-02-16T18:25:01.526+05:30 [Info] Moving index 5555:55151:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9008
2023-02-16T18:25:01.526+05:30 [Info] Moving index 6666:66161:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9008
2023-02-16T18:25:01.526+05:30 [Info] Moving index 171717:1171171:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9008
2023-02-16T18:25:01.526+05:30 [Info] Moving index 141414:1141141:0 from source 127.0.0.1:9004 to dest 127.0.0.1:9008
2023-02-16T18:25:01.526+05:30 [Info] Planner::findPlacementForDeletedNodes Remove Deleted Node from solution 127.0.0.1:9007 SG sg1 Memory 0 Units 0
2023-02-16T18:25:01.526+05:30 [Info] Planner::findPlacementForDeletedNodes Remove Deleted Node from solution 127.0.0.1:9004 SG sg2 Memory 0 Units 0
2023-02-16T18:25:01.526+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9008 SG sg2 Memory 800000000 Units 8000
2023-02-16T18:25:01.526+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 800000000 Units 8000
2023-02-16T18:25:01.526+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9008 127.0.0.1:9001]]
2023-02-16T18:25:01.526+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM []
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 2 Subclusters, 1 empty, 1 Above HWM
2023-02-16T18:25:01.527+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 900000000 Units 9000
2023-02-16T18:25:01.527+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 900000000 Units 9000
2023-02-16T18:25:01.527+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.527+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM [[127.0.0.1:9001 127.0.0.1:9004]]
2023-02-16T18:25:01.527+05:30 [Info] Planner::executeTenantAwareRebal TenantsToBeMoved from source [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.527+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket4 MemoryUsage 300000000 UnitsUsage 3000 
2023-02-16T18:25:01.527+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 200000000 UnitsUsage 2000 
2023-02-16T18:25:01.527+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters Below LWM [[127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.527+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket4 MemoryUsage 300000000 UnitsUsage 3000  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.527+05:30 [Info] Planner::moveTenantsToLowUsageSubCluster Unable to place TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 200000000 UnitsUsage 2000  on any target
2023-02-16T18:25:01.527+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 600000000 Units 6000
2023-02-16T18:25:01.527+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9003 SG sg1 Memory 300000000 Units 3000
2023-02-16T18:25:01.527+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 600000000 Units 6000
2023-02-16T18:25:01.527+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9006 SG sg3 Memory 300000000 Units 3000
2023-02-16T18:25:01.527+05:30 [Info] Planner::evaluateSolutionForScaleIn Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9003 127.0.0.1:9006]]
2023-02-16T18:25:01.527+05:30 [Info] Planner::evaluateSolutionForScaleIn Found SubClusters below LWM [[127.0.0.1:9003 127.0.0.1:9006]]
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 2 Subclusters, 1 below LWM, 1 above HWM
2023-02-16T18:25:01.528+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 900000000 Units 9000
2023-02-16T18:25:01.528+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 300000000 Units 1000
2023-02-16T18:25:01.528+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 900000000 Units 9000
2023-02-16T18:25:01.528+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 300000000 Units 1000
2023-02-16T18:25:01.528+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005]]
2023-02-16T18:25:01.528+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM [[127.0.0.1:9001 127.0.0.1:9004]]
2023-02-16T18:25:01.528+05:30 [Info] Planner::executeTenantAwareRebal TenantsToBeMoved from source [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.528+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket4 MemoryUsage 300000000 UnitsUsage 3000 
2023-02-16T18:25:01.528+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 200000000 UnitsUsage 2000 
2023-02-16T18:25:01.528+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters Below LWM [[127.0.0.1:9002 127.0.0.1:9005]]
2023-02-16T18:25:01.528+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket4 MemoryUsage 300000000 UnitsUsage 3000  can be placed on [127.0.0.1:9002 127.0.0.1:9005]
2023-02-16T18:25:01.528+05:30 [Info] Planner::moveTenantsToLowUsageSubCluster Unable to place TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 200000000 UnitsUsage 2000  on any target
2023-02-16T18:25:01.528+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 600000000 Units 6000
2023-02-16T18:25:01.528+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 600000000 Units 4000
2023-02-16T18:25:01.529+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 600000000 Units 6000
2023-02-16T18:25:01.529+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 600000000 Units 4000
2023-02-16T18:25:01.529+05:30 [Info] Planner::evaluateSolutionForScaleIn Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005]]
2023-02-16T18:25:01.529+05:30 [Info] Planner::evaluateSolutionForScaleIn Found SubClusters below LWM []
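The scenario above shows the planner classifying subclusters against high- and low-watermark thresholds: a subcluster above HWM sheds tenants, and only subclusters below LWM are eligible targets. A minimal sketch of that classification step, using illustrative threshold constants (`memHWM`, `memLWM`, `unitHWM`, `unitLWM` are hypothetical values chosen to match the usages in this log, not the planner's real configuration):

```go
package main

import "fmt"

// SubCluster aggregates usage for a replica pair of index nodes,
// as reported by the groupIndexNodesIntoSubClusters log lines.
type SubCluster struct {
	Nodes  []string
	Memory uint64 // bytes
	Units  uint64
}

// Illustrative watermarks (assumed, not the planner's actual config).
const (
	memHWM  uint64 = 800000000
	memLWM  uint64 = 400000000
	unitHWM uint64 = 8000
	unitLWM uint64 = 4000
)

// aboveHWM reports whether either resource dimension exceeds its HWM.
func aboveHWM(s SubCluster) bool {
	return s.Memory > memHWM || s.Units > unitHWM
}

// belowLWM reports whether both resource dimensions are under their LWM.
func belowLWM(s SubCluster) bool {
	return s.Memory < memLWM && s.Units < unitLWM
}

func main() {
	subs := []SubCluster{
		{[]string{"127.0.0.1:9001", "127.0.0.1:9004"}, 900000000, 9000},
		{[]string{"127.0.0.1:9002", "127.0.0.1:9005"}, 300000000, 1000},
	}
	for _, s := range subs {
		fmt.Println(s.Nodes, "aboveHWM:", aboveHWM(s), "belowLWM:", belowLWM(s))
	}
}
```

With the 900000000/9000 and 300000000/1000 usages from this scenario, the first subcluster classifies above HWM and the second below LWM, matching the "Found SubClusters above HWM" / "Found SubClusters Below LWM" lines.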
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 3 Subclusters, 1 empty, 1 Above HWM, 1 below LWM
2023-02-16T18:25:01.529+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 900000000 Units 9000
2023-02-16T18:25:01.529+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 300000000 Units 1000
2023-02-16T18:25:01.529+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 900000000 Units 9000
2023-02-16T18:25:01.529+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 300000000 Units 1000
2023-02-16T18:25:01.530+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.530+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM [[127.0.0.1:9001 127.0.0.1:9004]]
2023-02-16T18:25:01.530+05:30 [Info] Planner::executeTenantAwareRebal TenantsToBeMoved from source [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.530+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket4 MemoryUsage 300000000 UnitsUsage 3000 
2023-02-16T18:25:01.530+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 200000000 UnitsUsage 2000 
2023-02-16T18:25:01.530+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters Below LWM [[127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.530+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket4 MemoryUsage 300000000 UnitsUsage 3000  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.530+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 200000000 UnitsUsage 2000  can be placed on [127.0.0.1:9002 127.0.0.1:9005]
2023-02-16T18:25:01.530+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 400000000 Units 4000
2023-02-16T18:25:01.530+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 500000000 Units 3000
2023-02-16T18:25:01.530+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9003 SG sg1 Memory 300000000 Units 3000
2023-02-16T18:25:01.530+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 400000000 Units 4000
2023-02-16T18:25:01.530+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 500000000 Units 3000
2023-02-16T18:25:01.530+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9006 SG sg3 Memory 300000000 Units 3000
2023-02-16T18:25:01.530+05:30 [Info] Planner::evaluateSolutionForScaleIn Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9003 127.0.0.1:9006]]
2023-02-16T18:25:01.530+05:30 [Info] Planner::evaluateSolutionForScaleIn Found SubClusters below LWM [[127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9003 127.0.0.1:9006]]
2023-02-16T18:25:01.530+05:30 [Info] Planner::findPlacementForDeletedNodes Deleted Nodes [127.0.0.1:9003 127.0.0.1:9006]
2023-02-16T18:25:01.530+05:30 [Info] Planner::moveTenantsFromDeletedNodes nonEmptyDeletedNodes [127.0.0.1:9003 127.0.0.1:9006]
2023-02-16T18:25:01.530+05:30 [Info] Planner::moveTenantsFromDeletedNodes pairForDeletedNodes [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.530+05:30 [Info] Planner::moveTenantsFromDeletedNodes newNodes []
2023-02-16T18:25:01.530+05:30 [Info] Planner::moveTenantsFromDeletedNodes Num deleted nodes 2 is more than num new/empty nodes 0
2023-02-16T18:25:01.530+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantsToBeMoved from source [127.0.0.1:9003 127.0.0.1:9006]
2023-02-16T18:25:01.530+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9003 TenantId bucket4 MemoryUsage 300000000 UnitsUsage 3000 
2023-02-16T18:25:01.530+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 400000000 Units 4000
2023-02-16T18:25:01.530+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 500000000 Units 3000
2023-02-16T18:25:01.530+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Skip Deleted Index Node 127.0.0.1:9003 SG sg1 Memory 300000000 Units 3000
2023-02-16T18:25:01.530+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 400000000 Units 4000
2023-02-16T18:25:01.530+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 500000000 Units 3000
2023-02-16T18:25:01.530+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Skip Deleted Index Node 127.0.0.1:9006 SG sg3 Memory 300000000 Units 3000
2023-02-16T18:25:01.530+05:30 [Info] Planner::moveTenantsFromDeletedNodes Found SubClusters Below LWM []
2023-02-16T18:25:01.530+05:30 [Error] Planner - Not enough capacity to place indexes of deleted nodes.
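The error above is the expected outcome for this scenario: the deleted pair's tenants need a below-LWM target, and none remains after the earlier moves. A minimal sketch of that placement-or-fail step (the greedy first-fit loop and the `Target` type are hypothetical simplifications for illustration; only the error text is taken from the log):

```go
package main

import (
	"errors"
	"fmt"
)

// TenantUsage is a tenant to evacuate from a deleted node,
// as in the TenantsToBeMoved log lines above.
type TenantUsage struct {
	Tenant string
	Memory uint64
	Units  uint64
}

// Target is a candidate subcluster with spare capacity before an
// assumed LWM (illustrative structure, not the planner's real type).
type Target struct {
	Nodes     string
	MemSpare  uint64
	UnitSpare uint64
}

// placeTenants greedily assigns each tenant to the first target with
// enough spare capacity in both dimensions; if any tenant cannot be
// placed, it returns the planner's capacity error.
func placeTenants(tenants []TenantUsage, targets []Target) error {
	for _, t := range tenants {
		placed := false
		for i := range targets {
			if targets[i].MemSpare >= t.Memory && targets[i].UnitSpare >= t.Units {
				targets[i].MemSpare -= t.Memory
				targets[i].UnitSpare -= t.Units
				placed = true
				break
			}
		}
		if !placed {
			return errors.New("Planner - Not enough capacity to place indexes of deleted nodes.")
		}
	}
	return nil
}

func main() {
	tenants := []TenantUsage{{"bucket4", 300000000, 3000}}
	// No below-LWM targets remain in this scenario, so placement fails.
	fmt.Println(placeTenants(tenants, nil))
}
```

The same loop succeeds in the later ScaleIn scenarios, where a below-LWM subcluster absorbs the deleted pair's tenant and the deleted nodes are then removed from the solution.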
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 2 Subclusters, 1 above LWM/below HWM, 1 empty
2023-02-16T18:25:01.531+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 700000000 Units 5000
2023-02-16T18:25:01.531+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 700000000 Units 5000
2023-02-16T18:25:01.531+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.531+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM []
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 2 Subclusters, Both above LWM/below HWM
2023-02-16T18:25:01.532+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 700000000 Units 5000
2023-02-16T18:25:01.532+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9003 SG sg1 Memory 650000000 Units 4500
2023-02-16T18:25:01.532+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 700000000 Units 5000
2023-02-16T18:25:01.532+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9006 SG sg3 Memory 650000000 Units 4500
2023-02-16T18:25:01.532+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9003 127.0.0.1:9006]]
2023-02-16T18:25:01.532+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM []
2023-02-16T18:25:01.532+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 700000000 Units 5000
2023-02-16T18:25:01.532+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9003 SG sg1 Memory 650000000 Units 4500
2023-02-16T18:25:01.532+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 700000000 Units 5000
2023-02-16T18:25:01.532+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9006 SG sg3 Memory 650000000 Units 4500
2023-02-16T18:25:01.532+05:30 [Info] Planner::evaluateSolutionForScaleIn Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9003 127.0.0.1:9006]]
2023-02-16T18:25:01.532+05:30 [Info] Planner::evaluateSolutionForScaleIn Found SubClusters below LWM []
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 2 Subclusters, 1 empty, 1 Above HWM (partial replica repair)
2023-02-16T18:25:01.533+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 900000000 Units 9000
2023-02-16T18:25:01.533+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 550000000 Units 5500
2023-02-16T18:25:01.533+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.533+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket4,,,idx2,0) on 127.0.0.1:9004
2023-02-16T18:25:01.533+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket5,,,idx2,0) on 127.0.0.1:9004
2023-02-16T18:25:01.533+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM [[127.0.0.1:9001 127.0.0.1:9004]]
2023-02-16T18:25:01.533+05:30 [Info] Planner::executeTenantAwareRebal TenantsToBeMoved from source [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.533+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket4 MemoryUsage 300000000 UnitsUsage 3000 
2023-02-16T18:25:01.533+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 200000000 UnitsUsage 2000 
2023-02-16T18:25:01.533+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters Below LWM [[127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.533+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket4 MemoryUsage 300000000 UnitsUsage 3000  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.533+05:30 [Info] Planner::moveTenantsToLowUsageSubCluster Unable to place TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 200000000 UnitsUsage 2000  on any target
2023-02-16T18:25:01.533+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 600000000 Units 6000
2023-02-16T18:25:01.533+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9003 SG sg1 Memory 300000000 Units 3000
2023-02-16T18:25:01.533+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 600000000 Units 6000
2023-02-16T18:25:01.533+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9006 SG sg3 Memory 300000000 Units 3000
2023-02-16T18:25:01.533+05:30 [Info] Planner::evaluateSolutionForScaleIn Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9003 127.0.0.1:9006]]
2023-02-16T18:25:01.533+05:30 [Info] Planner::evaluateSolutionForScaleIn Found SubClusters below LWM [[127.0.0.1:9003 127.0.0.1:9006]]
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 2 Subclusters, 1 empty, 1 Above HWM(full replica repair)
2023-02-16T18:25:01.534+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 900000000 Units 9000
2023-02-16T18:25:01.534+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9004 127.0.0.1:9001] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.534+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket2,,,idx1,1) on 127.0.0.1:9001
2023-02-16T18:25:01.534+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket2,,,idx2,1) on 127.0.0.1:9001
2023-02-16T18:25:01.534+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket4,,,idx1,1) on 127.0.0.1:9001
2023-02-16T18:25:01.534+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket4,,,idx2,1) on 127.0.0.1:9001
2023-02-16T18:25:01.534+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket5,,,idx1,1) on 127.0.0.1:9001
2023-02-16T18:25:01.534+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket5,,,idx2,1) on 127.0.0.1:9001
2023-02-16T18:25:01.534+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM [[127.0.0.1:9004 127.0.0.1:9001]]
2023-02-16T18:25:01.534+05:30 [Info] Planner::executeTenantAwareRebal TenantsToBeMoved from source [127.0.0.1:9004 127.0.0.1:9001]
2023-02-16T18:25:01.534+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9004 TenantId bucket4 MemoryUsage 300000000 UnitsUsage 3000 
2023-02-16T18:25:01.534+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9004 TenantId bucket2 MemoryUsage 200000000 UnitsUsage 2000 
2023-02-16T18:25:01.534+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters Below LWM [[127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.534+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9004 TenantId bucket4 MemoryUsage 300000000 UnitsUsage 3000  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.534+05:30 [Info] Planner::moveTenantsToLowUsageSubCluster Unable to place TenantUsage - SourceId 127.0.0.1:9004 TenantId bucket2 MemoryUsage 200000000 UnitsUsage 2000  on any target
2023-02-16T18:25:01.534+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 600000000 Units 6000
2023-02-16T18:25:01.534+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9003 SG sg1 Memory 300000000 Units 3000
2023-02-16T18:25:01.534+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 600000000 Units 6000
2023-02-16T18:25:01.534+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9006 SG sg3 Memory 300000000 Units 3000
2023-02-16T18:25:01.534+05:30 [Info] Planner::evaluateSolutionForScaleIn Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9003 127.0.0.1:9006]]
2023-02-16T18:25:01.534+05:30 [Info] Planner::evaluateSolutionForScaleIn Found SubClusters below LWM [[127.0.0.1:9003 127.0.0.1:9006]]
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 ScaleIn- 2 Subclusters, Both below LWM, Positive Case
2023-02-16T18:25:01.535+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 200000000 Units 1000
2023-02-16T18:25:01.535+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9003 SG sg1 Memory 100000000 Units 2000
2023-02-16T18:25:01.535+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 200000000 Units 1000
2023-02-16T18:25:01.535+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9006 SG sg3 Memory 100000000 Units 2000
2023-02-16T18:25:01.535+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9003 127.0.0.1:9006]]
2023-02-16T18:25:01.535+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM []
2023-02-16T18:25:01.535+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 200000000 Units 1000
2023-02-16T18:25:01.535+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9003 SG sg1 Memory 100000000 Units 2000
2023-02-16T18:25:01.535+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 200000000 Units 1000
2023-02-16T18:25:01.535+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9006 SG sg3 Memory 100000000 Units 2000
2023-02-16T18:25:01.535+05:30 [Info] Planner::evaluateSolutionForScaleIn Found SubClusters  [[127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9003 127.0.0.1:9006]]
2023-02-16T18:25:01.535+05:30 [Info] Planner::evaluateSolutionForScaleIn Found SubClusters below LWM [[127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9003 127.0.0.1:9006]]
2023-02-16T18:25:01.535+05:30 [Info] Planner::findPlacementForDeletedNodes Deleted Nodes [127.0.0.1:9003 127.0.0.1:9006]
2023-02-16T18:25:01.535+05:30 [Info] Planner::moveTenantsFromDeletedNodes nonEmptyDeletedNodes [127.0.0.1:9003 127.0.0.1:9006]
2023-02-16T18:25:01.535+05:30 [Info] Planner::moveTenantsFromDeletedNodes pairForDeletedNodes [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.535+05:30 [Info] Planner::moveTenantsFromDeletedNodes newNodes []
2023-02-16T18:25:01.535+05:30 [Info] Planner::moveTenantsFromDeletedNodes Num deleted nodes 2 is more than num new/empty nodes 0
2023-02-16T18:25:01.535+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantsToBeMoved from source [127.0.0.1:9003 127.0.0.1:9006]
2023-02-16T18:25:01.535+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9003 TenantId bucket3 MemoryUsage 100000000 UnitsUsage 2000 
2023-02-16T18:25:01.535+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 200000000 Units 1000
2023-02-16T18:25:01.535+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Skip Deleted Index Node 127.0.0.1:9003 SG sg1 Memory 100000000 Units 2000
2023-02-16T18:25:01.535+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 200000000 Units 1000
2023-02-16T18:25:01.535+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Skip Deleted Index Node 127.0.0.1:9006 SG sg3 Memory 100000000 Units 2000
2023-02-16T18:25:01.535+05:30 [Info] Planner::moveTenantsFromDeletedNodes Found SubClusters Below LWM [[127.0.0.1:9002 127.0.0.1:9005]]
2023-02-16T18:25:01.535+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9003 TenantId bucket3 MemoryUsage 100000000 UnitsUsage 2000  can be placed on [127.0.0.1:9002 127.0.0.1:9005]
2023-02-16T18:25:01.535+05:30 [Info] Planner::findPlacementForDeletedNodes Remove Deleted Node from solution 127.0.0.1:9003 SG sg1 Memory 0 Units 0
2023-02-16T18:25:01.535+05:30 [Info] Planner::findPlacementForDeletedNodes Remove Deleted Node from solution 127.0.0.1:9006 SG sg3 Memory 0 Units 0
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 ScaleIn- 2 Subclusters, One below LWM/ 1 Empty
2023-02-16T18:25:01.536+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 200000000 Units 1000
2023-02-16T18:25:01.536+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 200000000 Units 1000
2023-02-16T18:25:01.536+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.536+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM []
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 ScaleIn- 3 Subclusters, One above HWM, one below LWM and 1 Empty. No ScaleIn.
2023-02-16T18:25:01.537+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 800000000 Units 8000
2023-02-16T18:25:01.537+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 100000000 Units 1000
2023-02-16T18:25:01.537+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 800000000 Units 8000
2023-02-16T18:25:01.537+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 100000000 Units 1000
2023-02-16T18:25:01.537+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.537+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM [[127.0.0.1:9001 127.0.0.1:9004]]
2023-02-16T18:25:01.537+05:30 [Info] Planner::executeTenantAwareRebal TenantsToBeMoved from source [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.537+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket4 MemoryUsage 300000000 UnitsUsage 3000 
2023-02-16T18:25:01.537+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 100000000 UnitsUsage 1000 
2023-02-16T18:25:01.537+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters Below LWM [[127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.537+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket4 MemoryUsage 300000000 UnitsUsage 3000  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.537+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 100000000 UnitsUsage 1000  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.537+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 400000000 Units 4000
2023-02-16T18:25:01.537+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 100000000 Units 1000
2023-02-16T18:25:01.537+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9003 SG sg1 Memory 400000000 Units 4000
2023-02-16T18:25:01.537+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 400000000 Units 4000
2023-02-16T18:25:01.537+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 100000000 Units 1000
2023-02-16T18:25:01.537+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9006 SG sg3 Memory 400000000 Units 4000
2023-02-16T18:25:01.537+05:30 [Info] Planner::evaluateSolutionForScaleIn Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9003 127.0.0.1:9006]]
2023-02-16T18:25:01.537+05:30 [Info] Planner::evaluateSolutionForScaleIn Found SubClusters below LWM [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9003 127.0.0.1:9006]]
2023-02-16T18:25:01.537+05:30 [Info] Planner::findPlacementForDeletedNodes Deleted Nodes [127.0.0.1:9002 127.0.0.1:9005]
2023-02-16T18:25:01.537+05:30 [Info] Planner::moveTenantsFromDeletedNodes nonEmptyDeletedNodes [127.0.0.1:9002 127.0.0.1:9005]
2023-02-16T18:25:01.537+05:30 [Info] Planner::moveTenantsFromDeletedNodes pairForDeletedNodes [127.0.0.1:9005 127.0.0.1:9002]
2023-02-16T18:25:01.537+05:30 [Info] Planner::moveTenantsFromDeletedNodes newNodes []
2023-02-16T18:25:01.537+05:30 [Info] Planner::moveTenantsFromDeletedNodes Num deleted nodes 2 is more than num new/empty nodes 0
2023-02-16T18:25:01.537+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantsToBeMoved from source [127.0.0.1:9002 127.0.0.1:9005]
2023-02-16T18:25:01.537+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket3 MemoryUsage 100000000 UnitsUsage 1000 
2023-02-16T18:25:01.537+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 400000000 Units 4000
2023-02-16T18:25:01.537+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Skip Deleted Index Node 127.0.0.1:9002 SG sg2 Memory 100000000 Units 1000
2023-02-16T18:25:01.537+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9003 SG sg1 Memory 400000000 Units 4000
2023-02-16T18:25:01.537+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 400000000 Units 4000
2023-02-16T18:25:01.537+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Skip Deleted Index Node 127.0.0.1:9005 SG sg3 Memory 100000000 Units 1000
2023-02-16T18:25:01.537+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9006 SG sg3 Memory 400000000 Units 4000
2023-02-16T18:25:01.537+05:30 [Info] Planner::moveTenantsFromDeletedNodes Found SubClusters Below LWM []
2023-02-16T18:25:01.537+05:30 [Error] Planner - Not enough capacity to place indexes of deleted nodes.
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 ScaleIn- 3 Subclusters, One above HWM, one below LWM and 1 Empty. ScaleIn. 
2023-02-16T18:25:01.538+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 800000000 Units 8000
2023-02-16T18:25:01.538+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 100000000 Units 1000
2023-02-16T18:25:01.538+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 800000000 Units 8000
2023-02-16T18:25:01.538+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 100000000 Units 1000
2023-02-16T18:25:01.538+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.538+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM [[127.0.0.1:9001 127.0.0.1:9004]]
2023-02-16T18:25:01.538+05:30 [Info] Planner::executeTenantAwareRebal TenantsToBeMoved from source [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.538+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket4 MemoryUsage 300000000 UnitsUsage 3000 
2023-02-16T18:25:01.538+05:30 [Info] Planner::executeTenantAwareRebal TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 100000000 UnitsUsage 1000 
2023-02-16T18:25:01.538+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters Below LWM [[127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.538+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket4 MemoryUsage 300000000 UnitsUsage 3000  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.538+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9001 TenantId bucket2 MemoryUsage 100000000 UnitsUsage 1000  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023-02-16T18:25:01.538+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 400000000 Units 4000
2023-02-16T18:25:01.538+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 100000000 Units 1000
2023-02-16T18:25:01.538+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9003 SG sg1 Memory 400000000 Units 4000
2023-02-16T18:25:01.538+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 400000000 Units 4000
2023-02-16T18:25:01.538+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 100000000 Units 1000
2023-02-16T18:25:01.538+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9006 SG sg3 Memory 400000000 Units 4000
2023-02-16T18:25:01.538+05:30 [Info] Planner::evaluateSolutionForScaleIn Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9003 127.0.0.1:9006]]
2023-02-16T18:25:01.538+05:30 [Info] Planner::evaluateSolutionForScaleIn Found SubClusters below LWM [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9003 127.0.0.1:9006]]
2023-02-16T18:25:01.538+05:30 [Info] Planner::findPlacementForDeletedNodes Deleted Nodes [127.0.0.1:9002 127.0.0.1:9005]
2023-02-16T18:25:01.538+05:30 [Info] Planner::moveTenantsFromDeletedNodes nonEmptyDeletedNodes [127.0.0.1:9002 127.0.0.1:9005]
2023-02-16T18:25:01.538+05:30 [Info] Planner::moveTenantsFromDeletedNodes pairForDeletedNodes [127.0.0.1:9005 127.0.0.1:9002]
2023-02-16T18:25:01.538+05:30 [Info] Planner::moveTenantsFromDeletedNodes newNodes []
2023-02-16T18:25:01.538+05:30 [Info] Planner::moveTenantsFromDeletedNodes Num deleted nodes 2 is more than num new/empty nodes 0
2023-02-16T18:25:01.538+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantsToBeMoved from source [127.0.0.1:9002 127.0.0.1:9005]
2023-02-16T18:25:01.538+05:30 [Info] Planner::moveTenantsFromDeletedNodes TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket3 MemoryUsage 100000000 UnitsUsage 1000 
2023-02-16T18:25:01.538+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 400000000 Units 4000
2023-02-16T18:25:01.538+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Skip Deleted Index Node 127.0.0.1:9002 SG sg2 Memory 100000000 Units 1000
2023-02-16T18:25:01.539+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9003 SG sg1 Memory 400000000 Units 4000
2023-02-16T18:25:01.539+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 400000000 Units 4000
2023-02-16T18:25:01.539+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Skip Deleted Index Node 127.0.0.1:9005 SG sg3 Memory 100000000 Units 1000
2023-02-16T18:25:01.539+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9006 SG sg3 Memory 400000000 Units 4000
2023-02-16T18:25:01.539+05:30 [Info] Planner::moveTenantsFromDeletedNodes Found SubClusters Below LWM [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9003 127.0.0.1:9006]]
2023-02-16T18:25:01.539+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9002 TenantId bucket3 MemoryUsage 100000000 UnitsUsage 1000  can be placed on [127.0.0.1:9001 127.0.0.1:9004]
2023-02-16T18:25:01.539+05:30 [Info] Planner::findPlacementForDeletedNodes Remove Deleted Node from solution 127.0.0.1:9002 SG sg2 Memory 0 Units 0
2023-02-16T18:25:01.539+05:30 [Info] Planner::findPlacementForDeletedNodes Remove Deleted Node from solution 127.0.0.1:9005 SG sg3 Memory 0 Units 0
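The scenario above shows the planner draining a deleted node pair by placing its tenant on a sub-cluster that sits below the low watermark. A minimal, hypothetical Go sketch of that placement decision (simplified types, a made-up LWM of 500000000 bytes / 5000 units, and values taken from the log; not the actual planner code):

```go
package main

import "fmt"

// Hypothetical, simplified model of tenant placement during scale-in.
// Assumed LWM thresholds; the real planner derives these from config.
const (
	lwmMemory = 500000000
	lwmUnits  = 5000
)

type SubCluster struct {
	Nodes  []string
	Memory int64
	Units  int64
}

type TenantUsage struct {
	SourceId string
	TenantId string
	Memory   int64
	Units    int64
}

// findTenantPlacement returns the first sub-cluster below the low
// watermark that can absorb the tenant without crossing it.
func findTenantPlacement(t TenantUsage, subs []SubCluster) (int, bool) {
	for i, s := range subs {
		if s.Memory+t.Memory <= lwmMemory && s.Units+t.Units <= lwmUnits {
			return i, true
		}
	}
	return -1, false
}

func main() {
	// Sub-clusters below LWM, as grouped in the log above.
	subs := []SubCluster{
		{Nodes: []string{"127.0.0.1:9001", "127.0.0.1:9004"}, Memory: 400000000, Units: 4000},
		{Nodes: []string{"127.0.0.1:9003", "127.0.0.1:9006"}, Memory: 400000000, Units: 4000},
	}
	tenant := TenantUsage{SourceId: "127.0.0.1:9002", TenantId: "bucket3", Memory: 100000000, Units: 1000}
	if i, ok := findTenantPlacement(tenant, subs); ok {
		fmt.Println("can be placed on", subs[i].Nodes) // can be placed on [127.0.0.1:9001 127.0.0.1:9004]
	}
}
```

The same check failing for every candidate is what produces the "Not Enough Capacity To Place Tenant" error seen in the later resume scenario.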
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Rebalance - 1 Subcluster, Below HWM (partial replica repair)
2023-02-16T18:25:01.539+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 500000000 Units 800
2023-02-16T18:25:01.539+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 350000000 Units 500
2023-02-16T18:25:01.539+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004]]
2023-02-16T18:25:01.539+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket4,,,idx2,0) on 127.0.0.1:9004
2023-02-16T18:25:01.539+05:30 [Info] Planner::placeMissingReplicaOnTarget Rebuilding lost replica for (bucket5,,,idx2,0) on 127.0.0.1:9004
2023-02-16T18:25:01.539+05:30 [Info] Planner::executeTenantAwareRebal Found SubClusters above HWM []
2023-02-16T18:25:01.539+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 500000000 Units 800
2023-02-16T18:25:01.539+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 500000000 Units 800
2023-02-16T18:25:01.540+05:30 [Info] Planner::evaluateSolutionForScaleIn Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004]]
2023-02-16T18:25:01.540+05:30 [Info] Planner::evaluateSolutionForScaleIn Found SubClusters below LWM [[127.0.0.1:9001 127.0.0.1:9004]]
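The two "Rebuilding lost replica" lines correspond to partial replica repair within a sub-cluster: any index present on only one node of the pair gets its missing replica rebuilt on the other node. A hedged sketch with hypothetical types (the real placeMissingReplicaOnTarget is considerably more involved):

```go
package main

import "fmt"

// Hypothetical sketch: a sub-cluster is a pair of nodes in distinct
// server groups; indexes are expected to exist on both nodes.

type index struct {
	bucket string
	name   string
}

// repairLostReplicas rebuilds, on the other node of the pair, every
// index that is missing there, and returns the actions taken.
func repairLostReplicas(pair [2]string, placement map[string][]index) []string {
	onNode := func(n string, idx index) bool {
		for _, i := range placement[n] {
			if i == idx {
				return true
			}
		}
		return false
	}
	var actions []string
	for pos, node := range pair {
		other := pair[1-pos]
		for _, idx := range placement[node] {
			if !onNode(other, idx) {
				actions = append(actions, fmt.Sprintf("rebuild (%s,%s) on %s", idx.bucket, idx.name, other))
				placement[other] = append(placement[other], idx)
			}
		}
	}
	return actions
}

func main() {
	placement := map[string][]index{
		"127.0.0.1:9001": {{"bucket4", "idx2"}, {"bucket5", "idx2"}},
		"127.0.0.1:9004": {},
	}
	for _, a := range repairLostReplicas([2]string{"127.0.0.1:9001", "127.0.0.1:9004"}, placement) {
		fmt.Println(a)
	}
}
```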
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Resume - 1 tenant, Empty node in cluster.
2023-02-16T18:25:01.541+05:30 [Info] Planner::executeTenantAwarePlanForResume Resume Nodes [127.0.0.1:9011 127.0.0.1:9012]
2023-02-16T18:25:01.541+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 900000000 Units 8000
2023-02-16T18:25:01.541+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 1000000000 Units 7000
2023-02-16T18:25:01.541+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 900000000 Units 8000
2023-02-16T18:25:01.541+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 1000000000 Units 7000
2023-02-16T18:25:01.541+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9007 SG sg1 Memory 200000000 Units 100
2023-02-16T18:25:01.541+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9008 SG sg2 Memory 200000000 Units 100
2023-02-16T18:25:01.541+05:30 [Info] Planner::executeTenantAwarePlanForResume Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9007 127.0.0.1:9008] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.541+05:30 [Info] Planner::executeTenantAwarePlanForResume TenantToBeResumed TenantUsage - SourceId 127.0.0.1:9011 TenantId resume_tenant_a MemoryUsage 250000000 UnitsUsage 250 
2023-02-16T18:25:01.541+05:30 [Info] Planner::executeTenantAwarePlanForResume Found SubClusters Below LWM [[127.0.0.1:9007 127.0.0.1:9008] [127.0.0.1:9006 127.0.0.1:9003]]
2023-02-16T18:25:01.541+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9011 TenantId resume_tenant_a MemoryUsage 250000000 UnitsUsage 250  can be placed on [127.0.0.1:9006 127.0.0.1:9003]
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Resume - 1 tenant, No empty node in cluster.
2023-02-16T18:25:01.543+05:30 [Info] Planner::executeTenantAwarePlanForResume Resume Nodes [127.0.0.1:9011 127.0.0.1:9012]
2023-02-16T18:25:01.543+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 900000000 Units 8000
2023-02-16T18:25:01.543+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 1000000000 Units 7000
2023-02-16T18:25:01.543+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 900000000 Units 8000
2023-02-16T18:25:01.543+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 1000000000 Units 7000
2023-02-16T18:25:01.543+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9007 SG sg1 Memory 200000000 Units 100
2023-02-16T18:25:01.543+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9008 SG sg2 Memory 200000000 Units 100
2023-02-16T18:25:01.543+05:30 [Info] Planner::executeTenantAwarePlanForResume Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9007 127.0.0.1:9008]]
2023-02-16T18:25:01.543+05:30 [Info] Planner::executeTenantAwarePlanForResume TenantToBeResumed TenantUsage - SourceId 127.0.0.1:9011 TenantId resume_tenant_a MemoryUsage 250000000 UnitsUsage 250 
2023-02-16T18:25:01.543+05:30 [Info] Planner::executeTenantAwarePlanForResume Found SubClusters Below LWM [[127.0.0.1:9007 127.0.0.1:9008]]
2023-02-16T18:25:01.543+05:30 [Info] Planner::findTenantPlacement TenantUsage - SourceId 127.0.0.1:9011 TenantId resume_tenant_a MemoryUsage 250000000 UnitsUsage 250  can be placed on [127.0.0.1:9007 127.0.0.1:9008]
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Resume - 1 tenant, No node below LWM in the cluster.
2023-02-16T18:25:01.545+05:30 [Info] Planner::executeTenantAwarePlanForResume Resume Nodes [127.0.0.1:9011 127.0.0.1:9012]
2023-02-16T18:25:01.545+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 900000000 Units 8000
2023-02-16T18:25:01.545+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 1000000000 Units 7000
2023-02-16T18:25:01.545+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 900000000 Units 8000
2023-02-16T18:25:01.545+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 1000000000 Units 7000
2023-02-16T18:25:01.545+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9007 SG sg1 Memory 200000000 Units 5000
2023-02-16T18:25:01.545+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9008 SG sg2 Memory 200000000 Units 5000
2023-02-16T18:25:01.545+05:30 [Info] Planner::executeTenantAwarePlanForResume Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9007 127.0.0.1:9008]]
2023-02-16T18:25:01.545+05:30 [Info] Planner::executeTenantAwarePlanForResume TenantToBeResumed TenantUsage - SourceId 127.0.0.1:9011 TenantId resume_tenant_a MemoryUsage 250000000 UnitsUsage 250 
2023-02-16T18:25:01.545+05:30 [Info] Planner::executeTenantAwarePlanForResume Found SubClusters Below LWM []
2023/02/16 18:25:01 Expected error No SubCluster Below Low Usage Threshold
2023/02/16 18:25:01 -------------------------------------------
2023/02/16 18:25:01 Resume - 1 tenant, Not enough capacity in the cluster.
2023-02-16T18:25:01.546+05:30 [Info] Planner::executeTenantAwarePlanForResume Resume Nodes [127.0.0.1:9011 127.0.0.1:9012]
2023-02-16T18:25:01.546+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9001 SG sg1 Memory 900000000 Units 8000
2023-02-16T18:25:01.546+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9002 SG sg2 Memory 1000000000 Units 7000
2023-02-16T18:25:01.546+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9004 SG sg2 Memory 900000000 Units 8000
2023-02-16T18:25:01.546+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9005 SG sg3 Memory 1000000000 Units 7000
2023-02-16T18:25:01.546+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9007 SG sg1 Memory 200000000 Units 100
2023-02-16T18:25:01.546+05:30 [Info] Planner::groupIndexNodesIntoSubClusters Index Node 127.0.0.1:9008 SG sg2 Memory 200000000 Units 100
2023-02-16T18:25:01.546+05:30 [Info] Planner::executeTenantAwarePlanForResume Found SubClusters  [[127.0.0.1:9001 127.0.0.1:9004] [127.0.0.1:9002 127.0.0.1:9005] [127.0.0.1:9007 127.0.0.1:9008]]
2023-02-16T18:25:01.546+05:30 [Info] Planner::executeTenantAwarePlanForResume TenantToBeResumed TenantUsage - SourceId 127.0.0.1:9011 TenantId resume_tenant_a MemoryUsage 700000000 UnitsUsage 250 
2023-02-16T18:25:01.546+05:30 [Info] Planner::executeTenantAwarePlanForResume Found SubClusters Below LWM [[127.0.0.1:9007 127.0.0.1:9008]]
2023-02-16T18:25:01.546+05:30 [Info] Planner::moveTenantsToLowUsageSubCluster Unable to place TenantUsage - SourceId 127.0.0.1:9011 TenantId resume_tenant_a MemoryUsage 700000000 UnitsUsage 250  on any target
2023/02/16 18:25:01 Expected error Not Enough Capacity To Place Tenant
--- PASS: TestTenantAwarePlanner (0.09s)
=== RUN   TestRestfulAPI
2023/02/16 18:25:01 In TestRestfulAPI()
2023/02/16 18:25:01 In DropAllSecondaryIndexes()
2023/02/16 18:25:01 Index found:  indexmut_1
2023/02/16 18:25:01 Dropped index indexmut_1
2023/02/16 18:25:01 Setting JSON docs in KV
2023/02/16 18:25:02 GET all indexes
2023/02/16 18:25:02 200 OK
2023/02/16 18:25:02 FOUND indexes: []
2023/02/16 18:25:02 DROP index: badindexid
2023/02/16 18:25:02 status: 400 Bad Request
2023/02/16 18:25:02 DROP index: 23544142
2023/02/16 18:25:02 status: 500 Internal Server Error
2023/02/16 18:25:02 TEST: malformed body
2023/02/16 18:25:02 400 Bad Request "invalid request body ({name:), unmarshal failed invalid character 'n' looking for beginning of object key string"

2023/02/16 18:25:02 TEST: missing field ``name``
2023/02/16 18:25:02 400 Bad Request "missing field name"
2023/02/16 18:25:02 TEST: empty field ``name``
2023/02/16 18:25:02 400 Bad Request "empty field name"
2023/02/16 18:25:02 TEST: missing field ``bucket``
2023/02/16 18:25:02 400 Bad Request "missing field bucket"
2023/02/16 18:25:02 TEST: empty field ``bucket``
2023/02/16 18:25:02 400 Bad Request "empty field bucket"
2023/02/16 18:25:02 TEST: missing field ``secExprs``
2023/02/16 18:25:02 400 Bad Request "missing field secExprs"
2023/02/16 18:25:02 TEST: empty field ``secExprs``
2023/02/16 18:25:02 400 Bad Request "empty field secExprs"
2023/02/16 18:25:02 TEST: incomplete field ``desc``
2023/02/16 18:25:02 400 Bad Request "incomplete desc information [true]"
2023/02/16 18:25:02 TEST: invalid field ``desc``
2023/02/16 18:25:02 400 Bad Request "incomplete desc information [1]"
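The 400 responses above come from field-by-field validation of the create-index request body: each required field is checked first for presence, then for non-emptiness. A simplified, hypothetical sketch of that check (field names and messages taken from the log; not the actual REST handler):

```go
package main

import (
	"fmt"
	"net/http"
)

// Hypothetical request shape: pointers distinguish a missing field
// (nil) from an empty one, mirroring the "missing"/"empty" messages.
type createIndexReq struct {
	Name     *string  `json:"name"`
	Bucket   *string  `json:"bucket"`
	SecExprs []string `json:"secExprs"`
}

func validate(r createIndexReq) (int, string) {
	switch {
	case r.Name == nil:
		return http.StatusBadRequest, "missing field name"
	case *r.Name == "":
		return http.StatusBadRequest, "empty field name"
	case r.Bucket == nil:
		return http.StatusBadRequest, "missing field bucket"
	case *r.Bucket == "":
		return http.StatusBadRequest, "empty field bucket"
	case r.SecExprs == nil:
		return http.StatusBadRequest, "missing field secExprs"
	case len(r.SecExprs) == 0:
		return http.StatusBadRequest, "empty field secExprs"
	}
	return http.StatusCreated, ""
}

func main() {
	code, msg := validate(createIndexReq{})
	fmt.Println(code, msg) // 400 missing field name
}
```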
2023/02/16 18:25:02 
2023/02/16 18:25:02 CREATE INDEX: idx1
2023/02/16 18:25:15 status : 201 Created
2023/02/16 18:25:15 {"id": "12756388884168720921"} 
2023/02/16 18:25:15 CREATE INDEX: idx2 (defer)
2023/02/16 18:25:15 status : 201 Created
2023/02/16 18:25:15 {"id": "16813685203519130512"} 
2023/02/16 18:25:15 CREATE INDEX: idx3 (defer)
2023/02/16 18:25:15 status : 201 Created
2023/02/16 18:25:15 {"id": "4316629429755233515"} 
2023/02/16 18:25:15 CREATE INDEX: idx4 (defer)
2023/02/16 18:25:15 status : 201 Created
2023/02/16 18:25:15 {"id": "12883999299824769090"} 
2023/02/16 18:25:15 CREATE INDEX: idx5
2023/02/16 18:25:29 status : 201 Created
2023/02/16 18:25:29 {"id": "2614057193255766268"} 
2023/02/16 18:25:29 BUILD single deferred index
2023/02/16 18:25:29 202 Accepted
2023/02/16 18:25:29 GET all indexes
2023/02/16 18:25:29 200 OK
2023/02/16 18:25:29 index idx1 in INDEX_STATE_ACTIVE
2023/02/16 18:25:29 GET all indexes
2023/02/16 18:25:29 200 OK
2023/02/16 18:25:29 index idx2 in INDEX_STATE_INITIAL
2023/02/16 18:25:30 GET all indexes
2023/02/16 18:25:30 200 OK
2023/02/16 18:25:30 index idx2 in INDEX_STATE_INITIAL
2023/02/16 18:25:31 GET all indexes
2023/02/16 18:25:31 200 OK
2023/02/16 18:25:31 index idx2 in INDEX_STATE_INITIAL
2023/02/16 18:25:32 GET all indexes
2023/02/16 18:25:32 200 OK
2023/02/16 18:25:32 index idx2 in INDEX_STATE_INITIAL
2023/02/16 18:25:33 GET all indexes
2023/02/16 18:25:33 200 OK
2023/02/16 18:25:33 index idx2 in INDEX_STATE_INITIAL
2023/02/16 18:25:34 GET all indexes
2023/02/16 18:25:34 200 OK
2023/02/16 18:25:34 index idx2 in INDEX_STATE_INITIAL
2023/02/16 18:25:35 GET all indexes
2023/02/16 18:25:35 200 OK
2023/02/16 18:25:35 index idx2 in INDEX_STATE_INITIAL
2023/02/16 18:25:36 GET all indexes
2023/02/16 18:25:36 200 OK
2023/02/16 18:25:36 index idx2 in INDEX_STATE_INITIAL
2023/02/16 18:25:37 GET all indexes
2023/02/16 18:25:37 200 OK
2023/02/16 18:25:37 index idx2 in INDEX_STATE_INITIAL
2023/02/16 18:25:38 GET all indexes
2023/02/16 18:25:38 200 OK
2023/02/16 18:25:38 index idx2 in INDEX_STATE_INITIAL
2023/02/16 18:25:39 GET all indexes
2023/02/16 18:25:39 200 OK
2023/02/16 18:25:39 index idx2 in INDEX_STATE_INITIAL
2023/02/16 18:25:40 GET all indexes
2023/02/16 18:25:40 200 OK
2023/02/16 18:25:40 index idx2 in INDEX_STATE_INITIAL
2023/02/16 18:25:41 GET all indexes
2023/02/16 18:25:41 200 OK
2023/02/16 18:25:41 index idx2 in INDEX_STATE_INITIAL
2023/02/16 18:25:42 GET all indexes
2023/02/16 18:25:43 200 OK
2023/02/16 18:25:43 index idx2 in INDEX_STATE_INITIAL
2023/02/16 18:25:44 GET all indexes
2023/02/16 18:25:44 200 OK
2023/02/16 18:25:44 index idx2 in INDEX_STATE_CATCHUP
2023/02/16 18:25:45 GET all indexes
2023/02/16 18:25:45 200 OK
2023/02/16 18:25:45 index idx2 in INDEX_STATE_ACTIVE
2023/02/16 18:25:45 BUILD many deferred index
2023/02/16 18:25:45 202 Accepted 
2023/02/16 18:25:45 GET all indexes
2023/02/16 18:25:45 200 OK
2023/02/16 18:25:45 index idx1 in INDEX_STATE_ACTIVE
2023/02/16 18:25:45 GET all indexes
2023/02/16 18:25:45 200 OK
2023/02/16 18:25:45 index idx2 in INDEX_STATE_ACTIVE
2023/02/16 18:25:45 GET all indexes
2023/02/16 18:25:45 200 OK
2023/02/16 18:25:45 index idx3 in INDEX_STATE_INITIAL
2023/02/16 18:25:46 GET all indexes
2023/02/16 18:25:46 200 OK
2023/02/16 18:25:46 index idx3 in INDEX_STATE_INITIAL
2023/02/16 18:25:47 GET all indexes
2023/02/16 18:25:47 200 OK
2023/02/16 18:25:47 index idx3 in INDEX_STATE_INITIAL
2023/02/16 18:25:48 GET all indexes
2023/02/16 18:25:48 200 OK
2023/02/16 18:25:48 index idx3 in INDEX_STATE_INITIAL
2023/02/16 18:25:49 GET all indexes
2023/02/16 18:25:49 200 OK
2023/02/16 18:25:49 index idx3 in INDEX_STATE_INITIAL
2023/02/16 18:25:50 GET all indexes
2023/02/16 18:25:50 200 OK
2023/02/16 18:25:50 index idx3 in INDEX_STATE_INITIAL
2023/02/16 18:25:51 GET all indexes
2023/02/16 18:25:51 200 OK
2023/02/16 18:25:51 index idx3 in INDEX_STATE_INITIAL
2023/02/16 18:25:52 GET all indexes
2023/02/16 18:25:52 200 OK
2023/02/16 18:25:52 index idx3 in INDEX_STATE_INITIAL
2023/02/16 18:25:53 GET all indexes
2023/02/16 18:25:53 200 OK
2023/02/16 18:25:53 index idx3 in INDEX_STATE_INITIAL
2023/02/16 18:25:54 GET all indexes
2023/02/16 18:25:54 200 OK
2023/02/16 18:25:54 index idx3 in INDEX_STATE_INITIAL
2023/02/16 18:25:55 GET all indexes
2023/02/16 18:25:55 200 OK
2023/02/16 18:25:55 index idx3 in INDEX_STATE_INITIAL
2023/02/16 18:25:56 GET all indexes
2023/02/16 18:25:56 200 OK
2023/02/16 18:25:56 index idx3 in INDEX_STATE_INITIAL
2023/02/16 18:25:57 GET all indexes
2023/02/16 18:25:57 200 OK
2023/02/16 18:25:57 index idx3 in INDEX_STATE_INITIAL
2023/02/16 18:25:58 GET all indexes
2023/02/16 18:25:58 200 OK
2023/02/16 18:25:58 index idx3 in INDEX_STATE_INITIAL
2023/02/16 18:25:59 GET all indexes
2023/02/16 18:25:59 200 OK
2023/02/16 18:25:59 index idx3 in INDEX_STATE_INITIAL
2023/02/16 18:26:00 GET all indexes
2023/02/16 18:26:00 200 OK
2023/02/16 18:26:00 index idx3 in INDEX_STATE_INITIAL
2023/02/16 18:26:01 GET all indexes
2023/02/16 18:26:01 200 OK
2023/02/16 18:26:01 index idx3 in INDEX_STATE_INITIAL
2023/02/16 18:26:02 GET all indexes
2023/02/16 18:26:02 200 OK
2023/02/16 18:26:02 index idx3 in INDEX_STATE_ACTIVE
2023/02/16 18:26:02 GET all indexes
2023/02/16 18:26:02 200 OK
2023/02/16 18:26:02 index idx4 in INDEX_STATE_ACTIVE
2023/02/16 18:26:02 GET all indexes
2023/02/16 18:26:03 200 OK
2023/02/16 18:26:03 index idx5 in INDEX_STATE_ACTIVE
2023/02/16 18:26:03 GET all indexes
2023/02/16 18:26:03 200 OK
2023/02/16 18:26:03 CREATED indexes: [12756388884168720921 16813685203519130512 4316629429755233515 12883999299824769090 2614057193255766268]
2023/02/16 18:26:03 
2023/02/16 18:26:03 LOOKUP missing index
2023/02/16 18:26:03 status : 404 Not Found
2023/02/16 18:26:03 LOOKUP Pyongyang
2023/02/16 18:26:03 status : 200 OK
2023/02/16 18:26:03 number of entries 554
2023/02/16 18:26:03 Expected and Actual scan responses are the same
2023/02/16 18:26:03 LOOKUP with stale as false
2023/02/16 18:26:03 status : 200 OK
2023/02/16 18:26:03 number of entries 554
2023/02/16 18:26:03 Expected and Actual scan responses are the same
2023/02/16 18:26:03 LOOKUP with Rome
2023/02/16 18:26:03 status : 200 OK
2023/02/16 18:26:03 number of entries 540
2023/02/16 18:26:03 Expected and Actual scan responses are the same
2023/02/16 18:26:03 RANGE missing index
2023/02/16 18:26:03 Status : 404 Not Found
2023/02/16 18:26:03 RANGE cities - none
2023/02/16 18:26:03 Status : 200 OK
2023/02/16 18:26:08 number of entries 140902
2023/02/16 18:26:09 Expected and Actual scan responses are the same
2023/02/16 18:26:09 RANGE cities -low
2023/02/16 18:26:09 Status : 200 OK
2023/02/16 18:26:14 number of entries 140902
2023/02/16 18:26:15 Expected and Actual scan responses are the same
2023/02/16 18:26:15 RANGE cities -high
2023/02/16 18:26:15 Status : 200 OK
2023/02/16 18:26:19 number of entries 140902
2023/02/16 18:26:20 Expected and Actual scan responses are the same
2023/02/16 18:26:20 RANGE cities - both
2023/02/16 18:26:20 Status : 200 OK
2023/02/16 18:26:25 number of entries 140902
2023/02/16 18:26:26 Expected and Actual scan responses are the same
2023/02/16 18:26:26 RANGE missing cities
2023/02/16 18:26:26 Status : 200 OK
2023/02/16 18:26:26 number of entries 0
2023/02/16 18:26:26 Expected and Actual scan responses are the same
2023/02/16 18:26:26 
2023/02/16 18:26:26 SCANALL missing index
2023/02/16 18:26:26 {"limit":1000000,"stale":"ok"}
2023/02/16 18:26:26 Status : 404 Not Found
2023/02/16 18:26:26 SCANALL stale ok
2023/02/16 18:26:26 {"limit":1000000,"stale":"ok"}
2023/02/16 18:26:26 Status : 200 OK
2023/02/16 18:26:31 number of entries 140902
2023/02/16 18:26:32 Expected and Actual scan responses are the same
2023/02/16 18:26:32 SCANALL stale false
2023/02/16 18:26:32 {"limit":1000000,"stale":"false"}
2023/02/16 18:26:32 Status : 200 OK
2023/02/16 18:26:37 number of entries 140902
2023/02/16 18:26:38 Expected and Actual scan responses are the same
2023/02/16 18:26:38 
2023/02/16 18:26:38 COUNT missing index
2023/02/16 18:26:38 Status : 404 Not Found
2023/02/16 18:26:38 COUNT cities - none
2023/02/16 18:26:38 Status : 200 OK
2023/02/16 18:26:38 number of entries 140902
2023/02/16 18:26:38 COUNT cities -low
2023/02/16 18:26:38 Status : 200 OK
2023/02/16 18:26:38 number of entries 140902
2023/02/16 18:26:38 COUNT cities -high
2023/02/16 18:26:38 Status : 200 OK
2023/02/16 18:26:38 number of entries 140902
2023/02/16 18:26:39 COUNT cities - both
2023/02/16 18:26:39 Status : 200 OK
2023/02/16 18:26:39 number of entries 140902
2023/02/16 18:26:39 COUNT missing cities
2023/02/16 18:26:39 Status : 200 OK
2023/02/16 18:26:39 number of entries 0
2023/02/16 18:26:39 
2023/02/16 18:26:40 STATS: Testing URLs with valid authentication
2023/02/16 18:26:40 STATS: Testing URLs with invalid authentication
2023/02/16 18:26:40 STATS: Testing invalid URLs
2023/02/16 18:26:40 STATS: Testing unsupported methods
2023/02/16 18:26:40 
--- PASS: TestRestfulAPI (99.44s)
=== RUN   TestStatIndexInstFilter
2023/02/16 18:26:40 CREATE INDEX: statIdx1
2023/02/16 18:26:54 status : 201 Created
2023/02/16 18:26:54 {"id": "344146792708732832"} 
2023/02/16 18:26:54 CREATE INDEX: statIdx2
2023/02/16 18:27:09 status : 201 Created
2023/02/16 18:27:09 {"id": "16379638936592745928"} 
2023/02/16 18:27:09 Instance Id for statIdx2 is 2155547273360106157, common.IndexInstId
--- PASS: TestStatIndexInstFilter (28.32s)
=== RUN   TestBucketDefaultDelete
2023-02-16T18:27:09.450+05:30 [Warn] Client:runObserveStreamingEndpoint streaming endpoint for /pools/default/bs/default returned err EOF
2023-02-16T18:27:09.450+05:30 [Warn] serviceChangeNotifier: Connection terminated for collection manifest notifier instance of http://%40query@127.0.0.1:9000, default, bucket: default, (EOF)
2023/02/16 18:27:11 Deleted bucket default, responseBody: 
2023/02/16 18:27:26 Created bucket default, responseBody: 
2023/02/16 18:27:42 Populating the default bucket
2023/02/16 18:27:50 Using n1ql client
2023-02-16T18:27:50.646+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T18:27:50.646+05:30 [Info] GSIC[default/default-_default-_default-1676552270644684313] started ...
2023/02/16 18:27:50 Scan failed as expected with error: Index Not Found - cause: GSI index index_isActive not found.
2023/02/16 18:27:50 Populating the default bucket after it was deleted
2023/02/16 18:28:00 Created the secondary index index_isActive. Waiting for it to become active
2023/02/16 18:28:00 Index is 7650372445131674090 now active
2023/02/16 18:28:00 Using n1ql client
2023/02/16 18:28:00 Expected and Actual scan responses are the same
--- PASS: TestBucketDefaultDelete (51.61s)
=== RUN   TestMixedDatatypesScanAll
2023/02/16 18:28:00 In TestMixedDatatypesScanAll()
2023/02/16 18:28:00 Before test begin: Length of kv docs is 10002
2023/02/16 18:28:00 In DropAllSecondaryIndexes()
2023/02/16 18:28:00 Index found:  index_isActive
2023/02/16 18:28:01 Dropped index index_isActive
2023/02/16 18:28:01 Number of number fields is: 245
2023/02/16 18:28:01 Number of string fields is: 273
2023/02/16 18:28:01 Number of json fields is: 239
2023/02/16 18:28:01 Number of true bool fields is: 138
2023/02/16 18:28:01 Number of false bool fields is: 105
2023/02/16 18:28:01 After generate docs: Length of kv docs is 11002
2023/02/16 18:28:01 Setting mixed datatypes JSON docs in KV
2023/02/16 18:28:06 Created the secondary index index_mixeddt. Waiting for it to become active
2023/02/16 18:28:06 Index is 114695629212911160 now active
2023/02/16 18:28:06 Using n1ql client
2023/02/16 18:28:06 Expected and Actual scan responses are the same
2023/02/16 18:28:06 Lengths of expected and actual scan results are:  1000 and 1000
2023/02/16 18:28:06 End: Length of kv docs is 11002
--- PASS: TestMixedDatatypesScanAll (6.05s)
=== RUN   TestMixedDatatypesRange_Float
2023/02/16 18:28:06 In TestMixedDatatypesRange_Float()
2023/02/16 18:28:06 In DropAllSecondaryIndexes()
2023/02/16 18:28:06 Index found:  index_mixeddt
2023/02/16 18:28:07 Dropped index index_mixeddt
2023/02/16 18:28:07 Number of number fields is: 255
2023/02/16 18:28:07 Number of string fields is: 264
2023/02/16 18:28:07 Number of json fields is: 235
2023/02/16 18:28:07 Number of true bool fields is: 132
2023/02/16 18:28:07 Number of false bool fields is: 114
2023/02/16 18:28:07 Setting mixed datatypes JSON docs in KV
2023/02/16 18:28:12 Created the secondary index index_mixeddt. Waiting for it to become active
2023/02/16 18:28:12 Index is 2507552047564377552 now active
2023/02/16 18:28:12 Using n1ql client
2023/02/16 18:28:13 Expected and Actual scan responses are the same
2023/02/16 18:28:13 Lengths of expected and actual scan results are:  19 and 19
2023/02/16 18:28:13 Using n1ql client
2023/02/16 18:28:13 Expected and Actual scan responses are the same
2023/02/16 18:28:13 Lengths of expected and actual scan results are:  0 and 0
2023/02/16 18:28:13 Length of kv docs is 12002
--- PASS: TestMixedDatatypesRange_Float (6.05s)
=== RUN   TestMixedDatatypesRange_String
2023/02/16 18:28:13 In TestMixedDatatypesRange_String()
2023/02/16 18:28:13 In DropAllSecondaryIndexes()
2023/02/16 18:28:13 Index found:  index_mixeddt
2023/02/16 18:28:13 Dropped index index_mixeddt
2023/02/16 18:28:13 Number of number fields is: 250
2023/02/16 18:28:13 Number of string fields is: 249
2023/02/16 18:28:13 Number of json fields is: 238
2023/02/16 18:28:13 Number of true bool fields is: 137
2023/02/16 18:28:13 Number of false bool fields is: 126
2023/02/16 18:28:13 Setting mixed datatypes JSON docs in KV
2023/02/16 18:28:19 Created the secondary index index_mixeddt. Waiting for it to become active
2023/02/16 18:28:19 Index is 14251087899245821764 now active
2023/02/16 18:28:19 Using n1ql client
2023/02/16 18:28:19 Expected and Actual scan responses are the same
2023/02/16 18:28:19 Lengths of expected and actual scan results are:  190 and 190
2023/02/16 18:28:19 Length of kv docs is 13002
--- PASS: TestMixedDatatypesRange_String (6.20s)
=== RUN   TestMixedDatatypesRange_Json
2023/02/16 18:28:19 In TestMixedDatatypesRange_Json()
2023/02/16 18:28:19 In DropAllSecondaryIndexes()
2023/02/16 18:28:19 Index found:  index_mixeddt
2023/02/16 18:28:19 Dropped index index_mixeddt
2023/02/16 18:28:19 Number of number fields is: 255
2023/02/16 18:28:19 Number of string fields is: 265
2023/02/16 18:28:19 Number of json fields is: 256
2023/02/16 18:28:19 Number of true bool fields is: 121
2023/02/16 18:28:19 Number of false bool fields is: 103
2023/02/16 18:28:19 Setting mixed datatypes JSON docs in KV
2023/02/16 18:28:25 Created the secondary index index_mixeddt. Waiting for it to become active
2023/02/16 18:28:25 Index is 13849286188525126541 now active
2023/02/16 18:28:25 Using n1ql client
2023/02/16 18:28:25 Expected and Actual scan responses are the same
2023/02/16 18:28:25 Lengths of expected and actual scan results are:  729 and 729
2023/02/16 18:28:25 Length of kv docs is 14002
--- PASS: TestMixedDatatypesRange_Json (6.00s)
=== RUN   TestMixedDatatypesScan_Bool
2023/02/16 18:28:25 In TestMixedDatatypesScan_Bool()
2023/02/16 18:28:25 In DropAllSecondaryIndexes()
2023/02/16 18:28:25 Index found:  index_mixeddt
2023/02/16 18:28:25 Dropped index index_mixeddt
2023/02/16 18:28:25 Number of number fields is: 242
2023/02/16 18:28:25 Number of string fields is: 237
2023/02/16 18:28:25 Number of json fields is: 262
2023/02/16 18:28:25 Number of true bool fields is: 137
2023/02/16 18:28:25 Number of false bool fields is: 122
2023/02/16 18:28:25 Setting mixed datatypes JSON docs in KV
2023/02/16 18:28:31 Created the secondary index index_mixeddt. Waiting for it to become active
2023/02/16 18:28:31 Index is 6431381281782900906 now active
2023/02/16 18:28:31 Using n1ql client
2023/02/16 18:28:32 Expected and Actual scan responses are the same
2023/02/16 18:28:32 Lengths of expected and actual scan results are:  527 and 527
2023/02/16 18:28:32 Using n1ql client
2023/02/16 18:28:32 Expected and Actual scan responses are the same
2023/02/16 18:28:32 Lengths of expected and actual scan results are:  465 and 465
2023/02/16 18:28:32 Length of kv docs is 15002
--- PASS: TestMixedDatatypesScan_Bool (6.83s)
=== RUN   TestLargeSecondaryKeyLength
2023/02/16 18:28:32 In TestLargeSecondaryKeyLength()
2023/02/16 18:28:32 In DropAllSecondaryIndexes()
2023/02/16 18:28:32 Index found:  index_mixeddt
2023/02/16 18:28:32 Dropped index index_mixeddt
2023/02/16 18:28:32 Setting JSON docs in KV
2023/02/16 18:28:39 Created the secondary index index_LongSecField. Waiting for it to become active
2023/02/16 18:28:39 Index is 3064415603123624659 now active
2023/02/16 18:28:39 Using n1ql client
2023/02/16 18:28:39 ScanAll: Lengths of expected and actual scan results are:  1000 and 1000
2023/02/16 18:28:39 Expected and Actual scan responses are the same
2023/02/16 18:28:39 Using n1ql client
2023/02/16 18:28:39 Range: Lengths of expected and actual scan results are:  822 and 822
2023/02/16 18:28:39 Expected and Actual scan responses are the same
2023/02/16 18:28:39 End: Length of kv docs is 16002
--- PASS: TestLargeSecondaryKeyLength (7.34s)
=== RUN   TestLargePrimaryKeyLength
2023/02/16 18:28:39 In TestLargePrimaryKeyLength()
2023/02/16 18:28:39 In DropAllSecondaryIndexes()
2023/02/16 18:28:39 Index found:  index_LongSecField
2023/02/16 18:28:39 Dropped index index_LongSecField
2023/02/16 18:28:39 Setting JSON docs in KV
2023/02/16 18:28:45 Created the secondary index index_LongPrimaryField. Waiting for it to become active
2023/02/16 18:28:45 Index is 18034819683579757448 now active
2023/02/16 18:28:45 Using n1ql client
2023/02/16 18:28:46 Lengths of num of docs and scanResults are:  17002 and 17002
2023/02/16 18:28:46 End: Length of kv docs is 17002
--- PASS: TestLargePrimaryKeyLength (6.72s)
=== RUN   TestUpdateMutations_DeleteField
2023/02/16 18:28:46 In TestUpdateMutations_DeleteField()
2023/02/16 18:28:46 Setting JSON docs in KV
2023/02/16 18:28:54 Created the secondary index index_bal. Waiting for it to become active
2023/02/16 18:28:54 Index is 14501187995255716101 now active
2023/02/16 18:28:54 Using n1ql client
2023/02/16 18:28:54 Expected and Actual scan responses are the same
2023/02/16 18:28:54 Using n1ql client
2023/02/16 18:28:54 Expected and Actual scan responses are the same
--- PASS: TestUpdateMutations_DeleteField (8.69s)
=== RUN   TestUpdateMutations_AddField
2023/02/16 18:28:54 In TestUpdateMutations_AddField()
2023/02/16 18:28:55 Setting JSON docs in KV
2023/02/16 18:29:02 Created the secondary index index_newField. Waiting for it to become active
2023/02/16 18:29:02 Index is 5345917510850898862 now active
2023/02/16 18:29:02 Using n1ql client
2023/02/16 18:29:02 Count of scan results before add field mutations:  0
2023/02/16 18:29:02 Expected and Actual scan responses are the same
2023/02/16 18:29:02 Using n1ql client
2023/02/16 18:29:02 Count of scan results after add field mutations:  300
2023/02/16 18:29:02 Expected and Actual scan responses are the same
--- PASS: TestUpdateMutations_AddField (7.82s)
=== RUN   TestUpdateMutations_DataTypeChange
2023/02/16 18:29:02 In TestUpdateMutations_DataTypeChange()
2023/02/16 18:29:02 Setting JSON docs in KV
2023/02/16 18:29:11 Created the secondary index index_isUserActive. Waiting for it to become active
2023/02/16 18:29:11 Index is 12153097790160609969 now active
2023/02/16 18:29:11 Using n1ql client
2023/02/16 18:29:11 Expected and Actual scan responses are the same
2023/02/16 18:29:11 Using n1ql client
2023/02/16 18:29:11 Expected and Actual scan responses are the same
2023/02/16 18:29:11 Using n1ql client
2023/02/16 18:29:11 Expected and Actual scan responses are the same
2023/02/16 18:29:11 Using n1ql client
2023/02/16 18:29:11 Expected and Actual scan responses are the same
--- PASS: TestUpdateMutations_DataTypeChange (9.08s)
=== RUN   TestMultipleBuckets
2023/02/16 18:29:11 In TestMultipleBuckets()
2023/02/16 18:29:11 In DropAllSecondaryIndexes()
2023/02/16 18:29:11 Index found:  index_newField
2023/02/16 18:29:11 Dropped index index_newField
2023/02/16 18:29:11 Index found:  index_isUserActive
2023/02/16 18:29:11 Dropped index index_isUserActive
2023/02/16 18:29:11 Index found:  index_bal
2023/02/16 18:29:12 Dropped index index_bal
2023/02/16 18:29:12 Index found:  index_LongPrimaryField
2023/02/16 18:29:12 Dropped index index_LongPrimaryField
2023/02/16 18:29:50 Flushed the bucket default, responseBody: 
2023/02/16 18:29:53 Modified parameters of bucket default, responseBody: 
2023/02/16 18:29:53 Created bucket testbucket2, responseBody: 
2023/02/16 18:29:53 Created bucket testbucket3, responseBody: 
2023/02/16 18:29:53 Created bucket testbucket4, responseBody: 
2023/02/16 18:30:08 Generating docs and Populating all the buckets
2023/02/16 18:30:12 Created the secondary index bucket1_age. Waiting for it to become active
2023/02/16 18:30:12 Index is 18268106177963141898 now active
2023/02/16 18:30:18 Created the secondary index bucket2_city. Waiting for it to become active
2023/02/16 18:30:18 Index is 1761460027071740264 now active
2023/02/16 18:30:25 Created the secondary index bucket3_gender. Waiting for it to become active
2023/02/16 18:30:25 Index is 4839912137826828390 now active
2023/02/16 18:30:31 Created the secondary index bucket4_balance. Waiting for it to become active
2023/02/16 18:30:31 Index is 4176401441386130872 now active
2023/02/16 18:30:34 Using n1ql client
2023/02/16 18:30:34 Expected and Actual scan responses are the same
2023/02/16 18:30:34 Using n1ql client
2023-02-16T18:30:34.534+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T18:30:34.535+05:30 [Info] GSIC[default/testbucket2-_default-_default-1676552434527526686] started ...
2023/02/16 18:30:34 Expected and Actual scan responses are the same
2023/02/16 18:30:34 Using n1ql client
2023-02-16T18:30:34.548+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T18:30:34.550+05:30 [Info] GSIC[default/testbucket3-_default-_default-1676552434546521530] started ...
2023/02/16 18:30:34 Expected and Actual scan responses are the same
2023/02/16 18:30:34 Using n1ql client
2023-02-16T18:30:34.564+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T18:30:34.564+05:30 [Info] GSIC[default/testbucket4-_default-_default-1676552434559697937] started ...
2023/02/16 18:30:34 Expected and Actual scan responses are the same
2023/02/16 18:30:37 Deleted bucket testbucket2, responseBody: 
2023/02/16 18:30:39 Deleted bucket testbucket3, responseBody: 
2023/02/16 18:30:41 Deleted bucket testbucket4, responseBody: 
2023/02/16 18:30:44 Modified parameters of bucket default, responseBody: 
--- PASS: TestMultipleBuckets (107.52s)
=== RUN   TestBucketFlush
2023/02/16 18:30:59 In TestBucketFlush()
2023/02/16 18:30:59 In DropAllSecondaryIndexes()
2023/02/16 18:30:59 Index found:  bucket1_age
2023/02/16 18:30:59 Dropped index bucket1_age
2023/02/16 18:31:36 Flushed the bucket default, Response body: 
2023/02/16 18:31:40 Created the secondary index index_age. Waiting for it to become active
2023/02/16 18:31:40 Index is 16889007554721919354 now active
2023/02/16 18:31:40 Using n1ql client
2023/02/16 18:31:41 Expected and Actual scan responses are the same
2023/02/16 18:31:47 Created the secondary index index_gender. Waiting for it to become active
2023/02/16 18:31:47 Index is 11082697904116142840 now active
2023/02/16 18:31:47 Using n1ql client
2023/02/16 18:31:47 Expected and Actual scan responses are the same
2023/02/16 18:31:53 Created the secondary index index_city. Waiting for it to become active
2023/02/16 18:31:53 Index is 17366468897172852381 now active
2023/02/16 18:31:53 Using n1ql client
2023/02/16 18:31:53 Expected and Actual scan responses are the same
2023/02/16 18:32:31 Flushed the bucket default, Response body: 
2023/02/16 18:32:31 TestBucketFlush:: Flushed the bucket
2023/02/16 18:32:31 Using n1ql client
2023/02/16 18:32:31 Using n1ql client
2023/02/16 18:32:31 Using n1ql client
--- PASS: TestBucketFlush (92.02s)
=== RUN   TestLargeDocumentSize
2023/02/16 18:32:31 In TestLargeDocumentSize()
2023/02/16 18:32:31 Data file exists. Skipping download
2023/02/16 18:32:31 Length of docs and largeDocs = 200 and 200
2023/02/16 18:32:35 Created the secondary index index_userscreenname. Waiting for it to become active
2023/02/16 18:32:35 Index is 11307817070856333411 now active
2023/02/16 18:32:35 Using n1ql client
2023/02/16 18:32:35 Expected and Actual scan responses are the same
--- PASS: TestLargeDocumentSize (4.50s)
=== RUN   TestFieldsWithSpecialCharacters
2023/02/16 18:32:35 In TestFieldsWithSpecialCharacters()
2023/02/16 18:32:42 Created the secondary index index_specialchar. Waiting for it to become active
2023/02/16 18:32:42 Index is 7086369011612779334 now active
2023/02/16 18:32:42 Looking up value §£††©€#
2023/02/16 18:32:42 Using n1ql client
2023/02/16 18:32:42 Expected and Actual scan responses are the same
--- PASS: TestFieldsWithSpecialCharacters (6.71s)
=== RUN   TestLargeKeyLookup
2023/02/16 18:32:42 In TestLargeKeyLookup()
2023/02/16 18:32:49 Created the secondary index index_largeKeyLookup. Waiting for it to become active
2023/02/16 18:32:49 Index is 13614494586208972110 now active
2023/02/16 18:32:49 Looking up a large key
2023/02/16 18:32:49 Using n1ql client
2023/02/16 18:32:49 Expected and Actual scan responses are the same
--- PASS: TestLargeKeyLookup (7.05s)
=== RUN   TestIndexNameValidation
2023/02/16 18:32:49 In TestIndexNameValidation()
2023/02/16 18:32:49 Setting JSON docs in KV
2023/02/16 18:32:50 Creation of index with invalid name ÌñÐÉx&(abc_% failed as expected
2023/02/16 18:32:56 Created the secondary index #primary-Index_test. Waiting for it to become active
2023/02/16 18:32:56 Index is 5366378908067396759 now active
2023/02/16 18:32:56 Using n1ql client
2023/02/16 18:32:56 Expected and Actual scan responses are the same
--- PASS: TestIndexNameValidation (6.74s)
=== RUN   TestSameFieldNameAtDifferentLevels
2023/02/16 18:32:56 In TestSameFieldNameAtDifferentLevels()
2023/02/16 18:32:56 Setting JSON docs in KV
2023/02/16 18:33:03 Created the secondary index cityindex. Waiting for it to become active
2023/02/16 18:33:03 Index is 1181035898816119101 now active
2023/02/16 18:33:03 Using n1ql client
2023/02/16 18:33:03 Expected and Actual scan responses are the same
--- PASS: TestSameFieldNameAtDifferentLevels (7.18s)
=== RUN   TestSameIndexNameInTwoBuckets
2023/02/16 18:33:03 In TestSameIndexNameInTwoBuckets()
2023/02/16 18:33:03 In DropAllSecondaryIndexes()
2023/02/16 18:33:03 Index found:  index_gender
2023/02/16 18:33:03 Dropped index index_gender
2023/02/16 18:33:03 Index found:  cityindex
2023/02/16 18:33:03 Dropped index cityindex
2023/02/16 18:33:03 Index found:  index_userscreenname
2023/02/16 18:33:03 Dropped index index_userscreenname
2023/02/16 18:33:03 Index found:  index_city
2023/02/16 18:33:03 Dropped index index_city
2023/02/16 18:33:03 Index found:  index_age
2023/02/16 18:33:03 Dropped index index_age
2023/02/16 18:33:03 Index found:  index_largeKeyLookup
2023/02/16 18:33:03 Dropped index index_largeKeyLookup
2023/02/16 18:33:03 Index found:  #primary-Index_test
2023/02/16 18:33:04 Dropped index #primary-Index_test
2023/02/16 18:33:04 Index found:  index_specialchar
2023/02/16 18:33:04 Dropped index index_specialchar
2023/02/16 18:33:42 Flushed the bucket default, Response body: 
2023/02/16 18:33:45 Modified parameters of bucket default, responseBody: 
2023/02/16 18:33:45 Created bucket buck2, responseBody: 
2023/02/16 18:34:00 Generating docs and populating all the buckets
2023/02/16 18:34:04 Created the secondary index b_idx. Waiting for it to become active
2023/02/16 18:34:04 Index is 6188120644973609141 now active
2023/02/16 18:34:10 Created the secondary index b_idx. Waiting for it to become active
2023/02/16 18:34:10 Index is 3349981893020134699 now active
2023/02/16 18:34:13 Using n1ql client
2023/02/16 18:34:13 Expected and Actual scan responses are the same
2023/02/16 18:34:13 Using n1ql client
2023-02-16T18:34:13.880+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T18:34:13.880+05:30 [Info] GSIC[default/buck2-_default-_default-1676552653877003020] started ...
2023/02/16 18:34:13 Expected and Actual scan responses are the same
2023/02/16 18:34:16 Modified parameters of bucket default, responseBody: 
2023/02/16 18:34:18 Deleted bucket buck2, responseBody: 
--- PASS: TestSameIndexNameInTwoBuckets (90.22s)
=== RUN   TestLargeKeysSplChars
2023/02/16 18:34:33 In TestLargeKeysSplChars()
2023/02/16 18:34:43 Created the secondary index idspl1. Waiting for it to become active
2023/02/16 18:34:43 Index is 6742004314034937360 now active
2023/02/16 18:34:51 Created the secondary index idspl2. Waiting for it to become active
2023/02/16 18:34:51 Index is 6733845757248370397 now active
2023/02/16 18:34:59 Created the secondary index idspl3. Waiting for it to become active
2023/02/16 18:34:59 Index is 7488792319015021773 now active
2023/02/16 18:34:59 Using n1ql client
2023/02/16 18:34:59 Expected and Actual scan responses are the same
2023-02-16T18:34:59.177+05:30 [Error] transport error between 127.0.0.1:34686->127.0.0.1:9107: write tcp 127.0.0.1:34686->127.0.0.1:9107: write: broken pipe
2023-02-16T18:34:59.177+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"]  request transport failed `write tcp 127.0.0.1:34686->127.0.0.1:9107: write: broken pipe`
2023-02-16T18:34:59.177+05:30 [Error] metadataClient:PickRandom: Fail to find indexer for all index partitions. Num partition 1.  Partition with instances 0 
2023-02-16T18:34:59.177+05:30 [Error] metadataClient:PickRandom: Replicas - [5267731719793666803], PrunedReplica - map[], FilteredReplica map[]
2023/02/16 18:34:59 Expected and Actual scan responses are the same
2023/02/16 18:34:59 Using n1ql client
2023/02/16 18:34:59 Expected and Actual scan responses are the same
--- PASS: TestLargeKeysSplChars (25.92s)
=== RUN   TestVeryLargeIndexKey
2023/02/16 18:34:59 In DropAllSecondaryIndexes()
2023/02/16 18:34:59 Index found:  idspl1
2023/02/16 18:34:59 Dropped index idspl1
2023/02/16 18:34:59 Index found:  idspl2
2023/02/16 18:34:59 Dropped index idspl2
2023/02/16 18:34:59 Index found:  b_idx
2023/02/16 18:34:59 Dropped index b_idx
2023/02/16 18:34:59 Index found:  idspl3
2023/02/16 18:34:59 Dropped index idspl3
2023/02/16 18:35:37 Flushed the bucket default, Response body: 
2023/02/16 18:35:37 TestVeryLargeIndexKey:: Flushed the bucket
2023/02/16 18:35:38 clusterconfig.KVAddress = 127.0.0.1:9000
2023/02/16 18:35:42 Created the secondary index i1. Waiting for it to become active
2023/02/16 18:35:42 Index is 8548917636709560856 now active
2023/02/16 18:35:42 Using n1ql client
2023/02/16 18:35:43 Expected and Actual scan responses are the same
2023/02/16 18:35:49 Created the secondary index i2. Waiting for it to become active
2023/02/16 18:35:49 Index is 14089806881038130634 now active
2023/02/16 18:35:49 Using n1ql client
2023/02/16 18:35:50 Expected and Actual scan responses are the same
2023/02/16 18:35:50 In DropAllSecondaryIndexes()
2023/02/16 18:35:50 Index found:  i1
2023/02/16 18:35:50 Dropped index i1
2023/02/16 18:35:50 Index found:  i2
2023/02/16 18:35:51 Dropped index i2
2023/02/16 18:36:29 Flushed the bucket default, Response body: 
--- PASS: TestVeryLargeIndexKey (89.67s)
=== RUN   TestTempBufScanResult
2023/02/16 18:36:29 In DropAllSecondaryIndexes()
2023/02/16 18:37:07 Flushed the bucket default, Response body: 
2023/02/16 18:37:07 TestTempBufScanResult:: Flushed the bucket
2023/02/16 18:37:10 Created the secondary index index_idxKey. Waiting for it to become active
2023/02/16 18:37:10 Index is 955523000239907064 now active
2023/02/16 18:37:10 Using n1ql client
2023/02/16 18:37:11 Expected and Actual scan responses are the same
2023/02/16 18:37:11 In DropAllSecondaryIndexes()
2023/02/16 18:37:11 Index found:  index_idxKey
2023/02/16 18:37:11 Dropped index index_idxKey
2023/02/16 18:37:49 Flushed the bucket default, Response body: 
--- PASS: TestTempBufScanResult (80.28s)
=== RUN   TestBuildDeferredAnotherBuilding
2023/02/16 18:37:49 In TestBuildDeferredAnotherBuilding()
2023/02/16 18:37:49 In DropAllSecondaryIndexes()
2023/02/16 18:38:34 Setting JSON docs in KV
2023/02/16 18:40:35 Build the deferred index id_age1. Waiting for the index to become active
2023/02/16 18:40:35 Waiting for index 15134976358295455032 to go active ...
2023/02/16 18:40:36 Waiting for index 15134976358295455032 to go active ...
2023/02/16 18:40:37 Waiting for index 15134976358295455032 to go active ...
2023/02/16 18:40:38 Waiting for index 15134976358295455032 to go active ...
2023/02/16 18:40:39 Waiting for index 15134976358295455032 to go active ...
2023/02/16 18:40:40 Waiting for index 15134976358295455032 to go active ...
2023/02/16 18:40:41 Waiting for index 15134976358295455032 to go active ...
2023/02/16 18:40:42 Waiting for index 15134976358295455032 to go active ...
2023/02/16 18:40:43 Waiting for index 15134976358295455032 to go active ...
2023/02/16 18:40:44 Waiting for index 15134976358295455032 to go active ...
2023/02/16 18:40:45 Waiting for index 15134976358295455032 to go active ...
2023/02/16 18:40:46 Waiting for index 15134976358295455032 to go active ...
2023/02/16 18:40:47 Waiting for index 15134976358295455032 to go active ...
2023/02/16 18:40:48 Waiting for index 15134976358295455032 to go active ...
2023/02/16 18:40:49 Waiting for index 15134976358295455032 to go active ...
2023/02/16 18:40:50 Waiting for index 15134976358295455032 to go active ...
2023/02/16 18:40:51 Waiting for index 15134976358295455032 to go active ...
2023/02/16 18:40:52 Waiting for index 15134976358295455032 to go active ...
2023/02/16 18:40:53 Index is 15134976358295455032 now active
2023/02/16 18:40:53 Build command issued for the deferred indexes [8089475721567930717]
2023/02/16 18:40:56 Build index failed as expected: Build index fails. Index id_age will retry building in the background for reason: Build Already In Progress. Keyspace default.
2023/02/16 18:40:56 Waiting for index 8089475721567930717 to go active ...
2023/02/16 18:40:57 Waiting for index 8089475721567930717 to go active ...
2023/02/16 18:40:58 Waiting for index 8089475721567930717 to go active ...
2023/02/16 18:40:59 Waiting for index 8089475721567930717 to go active ...
2023/02/16 18:41:00 Waiting for index 8089475721567930717 to go active ...
2023/02/16 18:41:01 Waiting for index 8089475721567930717 to go active ...
2023/02/16 18:41:02 Waiting for index 8089475721567930717 to go active ...
2023/02/16 18:41:03 Waiting for index 8089475721567930717 to go active ...
2023/02/16 18:41:04 Waiting for index 8089475721567930717 to go active ...
2023/02/16 18:41:05 Waiting for index 8089475721567930717 to go active ...
2023/02/16 18:41:06 Waiting for index 8089475721567930717 to go active ...
2023/02/16 18:41:07 Waiting for index 8089475721567930717 to go active ...
2023/02/16 18:41:08 Waiting for index 8089475721567930717 to go active ...
2023/02/16 18:41:09 Waiting for index 8089475721567930717 to go active ...
2023/02/16 18:41:10 Waiting for index 8089475721567930717 to go active ...
2023/02/16 18:41:11 Waiting for index 8089475721567930717 to go active ...
2023/02/16 18:41:12 Waiting for index 8089475721567930717 to go active ...
2023/02/16 18:41:13 Index is 8089475721567930717 now active
2023/02/16 18:41:13 Waiting for index 17894681919690055494 to go active ...
2023/02/16 18:41:14 Waiting for index 17894681919690055494 to go active ...
2023/02/16 18:41:15 Waiting for index 17894681919690055494 to go active ...
2023/02/16 18:41:16 Waiting for index 17894681919690055494 to go active ...
2023/02/16 18:41:17 Waiting for index 17894681919690055494 to go active ...
2023/02/16 18:41:18 Waiting for index 17894681919690055494 to go active ...
2023/02/16 18:41:19 Waiting for index 17894681919690055494 to go active ...
2023/02/16 18:41:20 Waiting for index 17894681919690055494 to go active ...
2023/02/16 18:41:21 Waiting for index 17894681919690055494 to go active ...
2023/02/16 18:41:22 Waiting for index 17894681919690055494 to go active ...
2023/02/16 18:41:23 Waiting for index 17894681919690055494 to go active ...
2023/02/16 18:41:24 Waiting for index 17894681919690055494 to go active ...
2023/02/16 18:41:25 Waiting for index 17894681919690055494 to go active ...
2023/02/16 18:41:26 Waiting for index 17894681919690055494 to go active ...
2023/02/16 18:41:27 Waiting for index 17894681919690055494 to go active ...
2023/02/16 18:41:28 Waiting for index 17894681919690055494 to go active ...
2023/02/16 18:41:29 Waiting for index 17894681919690055494 to go active ...
2023/02/16 18:41:30 Waiting for index 17894681919690055494 to go active ...
2023/02/16 18:41:31 Index is 17894681919690055494 now active
2023/02/16 18:41:31 Using n1ql client
2023/02/16 18:41:31 Expected and Actual scan responses are the same
2023/02/16 18:41:31 Using n1ql client
2023/02/16 18:41:32 Expected and Actual scan responses are the same
--- PASS: TestBuildDeferredAnotherBuilding (223.46s)
=== RUN   TestMultipleBucketsDeferredBuild
2023/02/16 18:41:32 In TestMultipleBucketsDeferredBuild()
2023/02/16 18:41:32 In DropAllSecondaryIndexes()
2023/02/16 18:41:32 Index found:  id_age1
2023/02/16 18:41:33 Dropped index id_age1
2023/02/16 18:41:33 Index found:  id_age
2023/02/16 18:41:33 Dropped index id_age
2023/02/16 18:41:33 Index found:  id_company
2023/02/16 18:41:33 Dropped index id_company
2023/02/16 18:42:10 Flushed the bucket default, Response body: 
2023/02/16 18:42:13 Modified parameters of bucket default, responseBody: 
2023/02/16 18:42:13 http://127.0.0.1:9000/pools/default/buckets/defertest_buck2
2023/02/16 18:42:13 &{DELETE http://127.0.0.1:9000/pools/default/buckets/defertest_buck2 HTTP/1.1 1 1 map[Authorization:[Basic QWRtaW5pc3RyYXRvcjphc2Rhc2Q=] Content-Type:[application/x-www-form-urlencoded; charset=UTF-8]]   0 [] false 127.0.0.1:9000 map[] map[]  map[]      0xc000136000}
2023/02/16 18:42:13 &{404 Object Not Found 404 HTTP/1.1 1 1 map[Cache-Control:[no-cache,no-store,must-revalidate] Content-Length:[31] Content-Type:[text/plain] Date:[Thu, 16 Feb 2023 13:12:13 GMT] Expires:[Thu, 01 Jan 1970 00:00:00 GMT] Pragma:[no-cache] Server:[Couchbase Server] X-Content-Type-Options:[nosniff] X-Frame-Options:[DENY] X-Permitted-Cross-Domain-Policies:[none] X-Xss-Protection:[1; mode=block]] 0xc001594e80 31 [] false false map[] 0xc000d48500 }
2023/02/16 18:42:13 DeleteBucket failed for bucket defertest_buck2 
2023/02/16 18:42:13 Deleted bucket defertest_buck2, responseBody: Requested resource not found.
2023/02/16 18:42:13 Created bucket defertest_buck2, responseBody: 
2023/02/16 18:42:28 Setting JSON docs in KV
2023/02/16 18:43:47 Build command issued for the deferred indexes [14744822541116586464]
2023/02/16 18:43:48 Build command issued for the deferred indexes [1673413896236846822 1272328451953909095]
2023/02/16 18:43:48 Index state of 1272328451953909095 is INDEX_STATE_READY
2023/02/16 18:43:48 Waiting for index 14744822541116586464 to go active ...
2023/02/16 18:43:49 Waiting for index 14744822541116586464 to go active ...
2023/02/16 18:43:50 Waiting for index 14744822541116586464 to go active ...
2023/02/16 18:43:51 Waiting for index 14744822541116586464 to go active ...
2023/02/16 18:43:52 Waiting for index 14744822541116586464 to go active ...
2023/02/16 18:43:53 Waiting for index 14744822541116586464 to go active ...
2023/02/16 18:43:54 Waiting for index 14744822541116586464 to go active ...
2023/02/16 18:43:55 Waiting for index 14744822541116586464 to go active ...
2023/02/16 18:43:56 Waiting for index 14744822541116586464 to go active ...
2023/02/16 18:43:57 Waiting for index 14744822541116586464 to go active ...
2023/02/16 18:43:58 Waiting for index 14744822541116586464 to go active ...
2023/02/16 18:43:59 Waiting for index 14744822541116586464 to go active ...
2023/02/16 18:44:00 Index is 14744822541116586464 now active
2023/02/16 18:44:00 Waiting for index 1673413896236846822 to go active ...
2023/02/16 18:44:01 Waiting for index 1673413896236846822 to go active ...
2023/02/16 18:44:02 Waiting for index 1673413896236846822 to go active ...
2023/02/16 18:44:03 Waiting for index 1673413896236846822 to go active ...
2023/02/16 18:44:04 Waiting for index 1673413896236846822 to go active ...
2023/02/16 18:44:05 Waiting for index 1673413896236846822 to go active ...
2023/02/16 18:44:06 Waiting for index 1673413896236846822 to go active ...
2023/02/16 18:44:07 Waiting for index 1673413896236846822 to go active ...
2023/02/16 18:44:08 Waiting for index 1673413896236846822 to go active ...
2023/02/16 18:44:09 Waiting for index 1673413896236846822 to go active ...
2023/02/16 18:44:10 Waiting for index 1673413896236846822 to go active ...
2023/02/16 18:44:11 Waiting for index 1673413896236846822 to go active ...
2023/02/16 18:44:12 Waiting for index 1673413896236846822 to go active ...
2023/02/16 18:44:13 Waiting for index 1673413896236846822 to go active ...
2023/02/16 18:44:14 Waiting for index 1673413896236846822 to go active ...
2023/02/16 18:44:15 Waiting for index 1673413896236846822 to go active ...
2023/02/16 18:44:16 Waiting for index 1673413896236846822 to go active ...
2023/02/16 18:44:17 Waiting for index 1673413896236846822 to go active ...
2023/02/16 18:44:18 Waiting for index 1673413896236846822 to go active ...
2023/02/16 18:44:19 Waiting for index 1673413896236846822 to go active ...
2023/02/16 18:44:20 Waiting for index 1673413896236846822 to go active ...
2023/02/16 18:44:21 Waiting for index 1673413896236846822 to go active ...
2023/02/16 18:44:22 Waiting for index 1673413896236846822 to go active ...
2023/02/16 18:44:23 Waiting for index 1673413896236846822 to go active ...
2023/02/16 18:44:24 Waiting for index 1673413896236846822 to go active ...
2023/02/16 18:44:25 Index is 1673413896236846822 now active
2023/02/16 18:44:25 Using n1ql client
2023/02/16 18:44:25 Expected and Actual scan responses are the same
2023/02/16 18:44:25 Using n1ql client
2023/02/16 18:44:25 Expected and Actual scan responses are the same
2023/02/16 18:44:25 Using n1ql client
2023-02-16T18:44:25.702+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T18:44:25.703+05:30 [Info] GSIC[default/defertest_buck2-_default-_default-1676553265698813041] started ...
2023/02/16 18:44:25 Expected and Actual scan responses are the same
2023/02/16 18:44:28 Modified parameters of bucket default, responseBody: 
2023/02/16 18:44:31 Deleted bucket defertest_buck2, responseBody: 
--- PASS: TestMultipleBucketsDeferredBuild (183.17s)
=== RUN   TestCreateDropCreateDeferredIndex
2023/02/16 18:44:36 In TestCreateDropCreateDeferredIndex()
2023/02/16 18:44:36 In DropAllSecondaryIndexes()
2023/02/16 18:44:36 Index found:  buck1_id2
2023/02/16 18:44:36 Dropped index buck1_id2
2023/02/16 18:44:36 Index found:  buck1_id1
2023/02/16 18:44:36 Dropped index buck1_id1
2023/02/16 18:44:38 Setting JSON docs in KV
2023/02/16 18:44:52 Created the secondary index id_company. Waiting for it to become active
2023/02/16 18:44:52 Index is 6866562827126708973 now active
2023/02/16 18:44:53 Dropping the secondary index id_age
2023/02/16 18:44:53 Index dropped
2023/02/16 18:44:56 Setting JSON docs in KV
2023/02/16 18:45:05 Using n1ql client
2023/02/16 18:45:05 Expected and Actual scan responses are the same
--- PASS: TestCreateDropCreateDeferredIndex (29.68s)
=== RUN   TestMultipleDeferredIndexes_BuildTogether
2023/02/16 18:45:05 In TestMultipleDeferredIndexes_BuildTogether()
2023/02/16 18:45:05 In DropAllSecondaryIndexes()
2023/02/16 18:45:05 Index found:  id_company
2023/02/16 18:45:06 Dropped index id_company
2023/02/16 18:45:08 Setting JSON docs in KV
2023/02/16 18:45:24 Created the secondary index id_company. Waiting for it to become active
2023/02/16 18:45:24 Index is 9845910587579244462 now active
2023/02/16 18:45:26 Build command issued for the deferred indexes [id_age id_gender id_isActive], bucket: default, scope: _default, coll: _default
2023/02/16 18:45:26 Waiting for the index id_age to become active
2023/02/16 18:45:26 Waiting for index 13349798880148584576 to go active ...
2023/02/16 18:45:27 Waiting for index 13349798880148584576 to go active ...
2023/02/16 18:45:28 Waiting for index 13349798880148584576 to go active ...
2023/02/16 18:45:29 Waiting for index 13349798880148584576 to go active ...
2023/02/16 18:45:30 Waiting for index 13349798880148584576 to go active ...
2023/02/16 18:45:31 Waiting for index 13349798880148584576 to go active ...
2023/02/16 18:45:32 Waiting for index 13349798880148584576 to go active ...
2023/02/16 18:45:33 Waiting for index 13349798880148584576 to go active ...
2023/02/16 18:45:34 Waiting for index 13349798880148584576 to go active ...
2023/02/16 18:45:35 Waiting for index 13349798880148584576 to go active ...
2023/02/16 18:45:36 Waiting for index 13349798880148584576 to go active ...
2023/02/16 18:45:37 Waiting for index 13349798880148584576 to go active ...
2023/02/16 18:45:38 Waiting for index 13349798880148584576 to go active ...
2023/02/16 18:45:39 Waiting for index 13349798880148584576 to go active ...
2023/02/16 18:45:40 Waiting for index 13349798880148584576 to go active ...
2023/02/16 18:45:41 Waiting for index 13349798880148584576 to go active ...
2023/02/16 18:45:42 Waiting for index 13349798880148584576 to go active ...
2023/02/16 18:45:43 Waiting for index 13349798880148584576 to go active ...
2023/02/16 18:45:44 Waiting for index 13349798880148584576 to go active ...
2023/02/16 18:45:45 Index is 13349798880148584576 now active
2023/02/16 18:45:45 Waiting for the index id_gender to become active
2023/02/16 18:45:45 Index is 18247501060571244215 now active
2023/02/16 18:45:45 Waiting for the index id_isActive to become active
2023/02/16 18:45:45 Index is 16509689505477589440 now active
2023/02/16 18:45:45 Using n1ql client
2023/02/16 18:45:46 Expected and Actual scan responses are the same
2023/02/16 18:45:48 Setting JSON docs in KV
2023/02/16 18:45:58 Using n1ql client
2023/02/16 18:45:58 Expected and Actual scan responses are the same
2023/02/16 18:45:58 Using n1ql client
2023/02/16 18:45:59 Expected and Actual scan responses are the same
--- PASS: TestMultipleDeferredIndexes_BuildTogether (53.29s)
=== RUN   TestMultipleDeferredIndexes_BuildOneByOne
2023/02/16 18:45:59 In TestMultipleDeferredIndexes_BuildOneByOne()
2023/02/16 18:45:59 In DropAllSecondaryIndexes()
2023/02/16 18:45:59 Index found:  id_age
2023/02/16 18:45:59 Dropped index id_age
2023/02/16 18:45:59 Index found:  id_gender
2023/02/16 18:45:59 Dropped index id_gender
2023/02/16 18:45:59 Index found:  id_company
2023/02/16 18:45:59 Dropped index id_company
2023/02/16 18:45:59 Index found:  id_isActive
2023/02/16 18:45:59 Dropped index id_isActive
2023/02/16 18:46:02 Setting JSON docs in KV
2023/02/16 18:46:17 Created the secondary index id_company. Waiting for it to become active
2023/02/16 18:46:17 Index is 16391892505501097584 now active
2023/02/16 18:46:19 Build command issued for the deferred indexes [id_age], bucket: default, scope: _default, coll: _default
2023/02/16 18:46:19 Waiting for the index id_age to become active
2023/02/16 18:46:19 Waiting for index 9222921450444853417 to go active ...
2023/02/16 18:46:20 Waiting for index 9222921450444853417 to go active ...
2023/02/16 18:46:21 Waiting for index 9222921450444853417 to go active ...
2023/02/16 18:46:22 Waiting for index 9222921450444853417 to go active ...
2023/02/16 18:46:23 Waiting for index 9222921450444853417 to go active ...
2023/02/16 18:46:24 Waiting for index 9222921450444853417 to go active ...
2023/02/16 18:46:25 Waiting for index 9222921450444853417 to go active ...
2023/02/16 18:46:26 Waiting for index 9222921450444853417 to go active ...
2023/02/16 18:46:27 Waiting for index 9222921450444853417 to go active ...
2023/02/16 18:46:28 Waiting for index 9222921450444853417 to go active ...
2023/02/16 18:46:29 Waiting for index 9222921450444853417 to go active ...
2023/02/16 18:46:30 Waiting for index 9222921450444853417 to go active ...
2023/02/16 18:46:31 Index is 9222921450444853417 now active
2023/02/16 18:46:31 Build command issued for the deferred indexes [id_gender], bucket: default, scope: _default, coll: _default
2023/02/16 18:46:31 Waiting for the index id_gender to become active
2023/02/16 18:46:31 Waiting for index 3754938084340535689 to go active ...
2023/02/16 18:46:32 Waiting for index 3754938084340535689 to go active ...
2023/02/16 18:46:33 Waiting for index 3754938084340535689 to go active ...
2023/02/16 18:46:34 Waiting for index 3754938084340535689 to go active ...
2023/02/16 18:46:35 Waiting for index 3754938084340535689 to go active ...
2023/02/16 18:46:36 Waiting for index 3754938084340535689 to go active ...
2023/02/16 18:46:37 Waiting for index 3754938084340535689 to go active ...
2023/02/16 18:46:38 Waiting for index 3754938084340535689 to go active ...
2023/02/16 18:46:39 Waiting for index 3754938084340535689 to go active ...
2023/02/16 18:46:40 Waiting for index 3754938084340535689 to go active ...
2023/02/16 18:46:41 Waiting for index 3754938084340535689 to go active ...
2023/02/16 18:46:42 Waiting for index 3754938084340535689 to go active ...
2023/02/16 18:46:43 Waiting for index 3754938084340535689 to go active ...
2023/02/16 18:46:44 Waiting for index 3754938084340535689 to go active ...
2023/02/16 18:46:45 Index is 3754938084340535689 now active
2023/02/16 18:46:45 Build command issued for the deferred indexes [id_isActive], bucket: default, scope: _default, coll: _default
2023/02/16 18:46:45 Waiting for the index id_isActive to become active
2023/02/16 18:46:45 Waiting for index 3998623796986285072 to go active ...
2023/02/16 18:46:46 Waiting for index 3998623796986285072 to go active ...
2023/02/16 18:46:47 Waiting for index 3998623796986285072 to go active ...
2023/02/16 18:46:48 Waiting for index 3998623796986285072 to go active ...
2023/02/16 18:46:49 Waiting for index 3998623796986285072 to go active ...
2023/02/16 18:46:50 Waiting for index 3998623796986285072 to go active ...
2023/02/16 18:46:51 Waiting for index 3998623796986285072 to go active ...
2023/02/16 18:46:52 Waiting for index 3998623796986285072 to go active ...
2023/02/16 18:46:53 Waiting for index 3998623796986285072 to go active ...
2023/02/16 18:46:54 Waiting for index 3998623796986285072 to go active ...
2023/02/16 18:46:55 Waiting for index 3998623796986285072 to go active ...
2023/02/16 18:46:56 Waiting for index 3998623796986285072 to go active ...
2023/02/16 18:46:57 Waiting for index 3998623796986285072 to go active ...
2023/02/16 18:46:58 Index is 3998623796986285072 now active
2023/02/16 18:46:58 Using n1ql client
2023/02/16 18:46:58 Expected and Actual scan responses are the same
2023/02/16 18:47:01 Setting JSON docs in KV
2023/02/16 18:47:11 Using n1ql client
2023/02/16 18:47:11 Expected and Actual scan responses are the same
2023/02/16 18:47:11 Using n1ql client
2023/02/16 18:47:11 Expected and Actual scan responses are the same
--- PASS: TestMultipleDeferredIndexes_BuildOneByOne (72.74s)
=== RUN   TestDropDeferredIndexWhileOthersBuilding
2023/02/16 18:47:11 In TestDropDeferredIndexWhileOthersBuilding()
2023/02/16 18:47:11 In DropAllSecondaryIndexes()
2023/02/16 18:47:11 Index found:  id_age
2023/02/16 18:47:11 Dropped index id_age
2023/02/16 18:47:11 Index found:  id_isActive
2023/02/16 18:47:12 Dropped index id_isActive
2023/02/16 18:47:12 Index found:  id_gender
2023/02/16 18:47:12 Dropped index id_gender
2023/02/16 18:47:12 Index found:  id_company
2023/02/16 18:47:12 Dropped index id_company
2023/02/16 18:47:14 Setting JSON docs in KV
2023/02/16 18:47:33 Created the secondary index id_company. Waiting for it to become active
2023/02/16 18:47:33 Index is 3301087679210312912 now active
2023/02/16 18:47:35 Build command issued for the deferred indexes [5531777638213399902 6734362318277528103]
2023/02/16 18:47:37 Dropping the secondary index id_isActive
2023/02/16 18:47:37 Index dropped
2023/02/16 18:47:37 Waiting for index 5531777638213399902 to go active ...
 (last message repeated 16 times, once per second, through 18:47:53)
2023/02/16 18:47:54 Index is 5531777638213399902 now active
2023/02/16 18:47:54 Index is 6734362318277528103 now active
2023/02/16 18:47:55 Using n1ql client
2023/02/16 18:47:55 Expected and Actual scan responses are the same
2023/02/16 18:47:55 Using n1ql client
2023/02/16 18:47:55 Expected and Actual scan responses are the same
2023/02/16 18:47:58 Setting JSON docs in KV
2023/02/16 18:48:07 Using n1ql client
2023/02/16 18:48:08 Expected and Actual scan responses are the same
--- PASS: TestDropDeferredIndexWhileOthersBuilding (56.45s)
=== RUN   TestDropBuildingDeferredIndex
2023/02/16 18:48:08 In TestDropBuildingDeferredIndex()
2023/02/16 18:48:08 In DropAllSecondaryIndexes()
2023/02/16 18:48:08 Index found:  id_company
2023/02/16 18:48:08 Dropped index id_company
2023/02/16 18:48:08 Index found:  id_gender
2023/02/16 18:48:08 Dropped index id_gender
2023/02/16 18:48:08 Index found:  id_age
2023/02/16 18:48:08 Dropped index id_age
2023/02/16 18:48:11 Setting JSON docs in KV
2023/02/16 18:48:18 Build command issued for the deferred indexes [17282825158866843405 17086667604116169317]
2023/02/16 18:48:19 Dropping the secondary index id_age
2023/02/16 18:48:19 Index dropped
2023/02/16 18:48:19 Waiting for index 17282825158866843405 to go active ...
 (last message repeated 13 times, once per second, through 18:48:32)
2023/02/16 18:48:33 Index is 17282825158866843405 now active
2023/02/16 18:48:33 Build command issued for the deferred indexes [id_gender], bucket: default, scope: _default, coll: _default
2023/02/16 18:48:33 Waiting for the index id_gender to become active
2023/02/16 18:48:33 Waiting for index 9108093117309956858 to go active ...
 (last message repeated 15 times, once per second, through 18:48:48)
2023/02/16 18:48:49 Index is 9108093117309956858 now active
2023/02/16 18:48:51 Using n1ql client
2023/02/16 18:48:51 Expected and Actual scan responses are the same
2023/02/16 18:48:51 Using n1ql client
2023/02/16 18:48:52 Expected and Actual scan responses are the same
2023/02/16 18:48:54 Setting JSON docs in KV
2023/02/16 18:49:03 Using n1ql client
2023/02/16 18:49:05 Expected and Actual scan responses are the same
--- PASS: TestDropBuildingDeferredIndex (57.18s)
=== RUN   TestDropMultipleBuildingDeferredIndexes
2023/02/16 18:49:05 In TestDropMultipleBuildingDeferredIndexes()
2023/02/16 18:49:05 In DropAllSecondaryIndexes()
2023/02/16 18:49:05 Index found:  id_company
2023/02/16 18:49:05 Dropped index id_company
2023/02/16 18:49:05 Index found:  id_gender
2023/02/16 18:49:05 Dropped index id_gender
2023/02/16 18:49:14 Setting JSON docs in KV
2023/02/16 18:49:50 Created the secondary index id_company. Waiting for it to become active
2023/02/16 18:49:50 Index is 5634333831053254878 now active
2023/02/16 18:49:52 Build command issued for the deferred indexes [14671653465319225768 6262139383912428689]
2023/02/16 18:49:53 Dropping the secondary index id_age
2023/02/16 18:49:53 Index dropped
2023/02/16 18:49:53 Dropping the secondary index id_gender
2023/02/16 18:50:11 Index dropped
2023/02/16 18:50:11 Build command issued for the deferred indexes [id_isActive], bucket: default, scope: _default, coll: _default
2023/02/16 18:50:11 Waiting for the index id_isActive to become active
2023/02/16 18:50:11 Waiting for index 3809057939372579685 to go active ...
 (last message repeated 17 times, once per second, through 18:50:28)
2023/02/16 18:50:29 Index is 3809057939372579685 now active
2023/02/16 18:50:39 Using n1ql client
2023/02/16 18:50:41 Expected and Actual scan responses are the same
2023/02/16 18:50:41 Number of docScanResults and scanResults = 180000 and 180000
2023/02/16 18:50:41 Using n1ql client
2023/02/16 18:50:42 Expected and Actual scan responses are the same
2023/02/16 18:50:42 Number of docScanResults and scanResults = 180000 and 180000
--- PASS: TestDropMultipleBuildingDeferredIndexes (97.18s)
=== RUN   TestDropOneIndexSecondDeferBuilding
2023/02/16 18:50:42 In TestDropOneIndexSecondDeferBuilding()
2023/02/16 18:50:42 In DropAllSecondaryIndexes()
2023/02/16 18:50:42 Index found:  id_isActive
2023/02/16 18:50:42 Dropped index id_isActive
2023/02/16 18:50:42 Index found:  id_company
2023/02/16 18:50:42 Dropped index id_company
2023/02/16 18:50:45 Setting JSON docs in KV
2023/02/16 18:50:51 Build command issued for the deferred indexes [id_company], bucket: default, scope: _default, coll: _default
2023/02/16 18:50:51 Waiting for the index id_company to become active
2023/02/16 18:50:51 Waiting for index 14733393294002379958 to go active ...
 (last message repeated 15 times, once per second, through 18:51:06)
2023/02/16 18:51:07 Index is 14733393294002379958 now active
2023/02/16 18:51:07 Build command issued for the deferred indexes [16977729336498579324]
2023/02/16 18:51:08 Dropping the secondary index id_company
2023/02/16 18:51:08 Index dropped
2023/02/16 18:51:13 Setting JSON docs in KV
2023/02/16 18:51:32 Waiting for index 16977729336498579324 to go active ...
2023/02/16 18:51:33 Index is 16977729336498579324 now active
2023/02/16 18:51:33 Build command issued for the deferred indexes [id_gender], bucket: default, scope: _default, coll: _default
2023/02/16 18:51:33 Waiting for the index id_gender to become active
2023/02/16 18:51:33 Waiting for index 3641392862217281709 to go active ...
 (last message repeated 19 times, once per second, through 18:51:52)
2023/02/16 18:51:53 Index is 3641392862217281709 now active
2023/02/16 18:51:53 Using n1ql client
2023/02/16 18:51:53 Expected and Actual scan responses are the same
2023/02/16 18:51:53 Using n1ql client
2023/02/16 18:51:54 Expected and Actual scan responses are the same
--- PASS: TestDropOneIndexSecondDeferBuilding (72.13s)
=== RUN   TestDropSecondIndexSecondDeferBuilding
2023/02/16 18:51:54 In TestDropSecondIndexSecondDeferBuilding()
2023/02/16 18:51:54 In DropAllSecondaryIndexes()
2023/02/16 18:51:54 Index found:  id_age
2023/02/16 18:51:54 Dropped index id_age
2023/02/16 18:51:54 Index found:  id_gender
2023/02/16 18:51:54 Dropped index id_gender
2023/02/16 18:51:57 Setting JSON docs in KV
2023/02/16 18:52:03 Build command issued for the deferred indexes [id_company], bucket: default, scope: _default, coll: _default
2023/02/16 18:52:03 Waiting for the index id_company to become active
2023/02/16 18:52:03 Waiting for index 1010876246557460884 to go active ...
 (last message repeated 18 times, once per second, through 18:52:21)
2023/02/16 18:52:22 Index is 1010876246557460884 now active
2023/02/16 18:52:22 Build command issued for the deferred indexes [4561481739415740566]
2023/02/16 18:52:23 Dropping the secondary index id_age
2023/02/16 18:52:23 Index dropped
2023/02/16 18:52:26 Setting JSON docs in KV
2023/02/16 18:52:36 Build command issued for the deferred indexes [id_gender], bucket: default, scope: _default, coll: _default
2023/02/16 18:52:36 Waiting for the index id_gender to become active
2023/02/16 18:52:36 Waiting for index 16542362454009266647 to go active ...
 (last message repeated 20 times, once per second, through 18:52:56)
2023/02/16 18:52:57 Index is 16542362454009266647 now active
2023/02/16 18:52:58 Using n1ql client
2023/02/16 18:52:58 Expected and Actual scan responses are the same
2023/02/16 18:52:58 Using n1ql client
2023/02/16 18:53:00 Expected and Actual scan responses are the same
--- PASS: TestDropSecondIndexSecondDeferBuilding (65.40s)
=== RUN   TestCreateAfterDropWhileIndexBuilding
2023/02/16 18:53:00 In TestCreateAfterDropWhileIndexBuilding()
2023/02/16 18:53:00 In DropAllSecondaryIndexes()
2023/02/16 18:53:00 Index found:  id_gender
2023/02/16 18:53:00 Dropped index id_gender
2023/02/16 18:53:00 Index found:  id_company
2023/02/16 18:53:00 Dropped index id_company
2023/02/16 18:53:23 Setting JSON docs in KV
2023/02/16 18:54:25 Build command issued for the deferred indexes [17729746618687277487]
2023/02/16 18:54:26 Waiting for index 17729746618687277487 to go active ...
 (last message repeated 26 times, once per second, through 18:54:52)
2023/02/16 18:54:53 Index is 17729746618687277487 now active
2023/02/16 18:54:53 Build command issued for the deferred indexes [9943312888691523154]
2023/02/16 18:54:54 Dropping the secondary index id_company
2023/02/16 18:54:54 Index dropped
2023/02/16 18:54:54 Dropping the secondary index id_age
2023/02/16 18:54:55 Index dropped
2023/02/16 18:55:02 Build command issued for the deferred indexes [id_gender], bucket: default, scope: _default, coll: _default
2023/02/16 18:55:02 Waiting for the index id_gender to become active
2023/02/16 18:55:02 Waiting for index 11736605722088051215 to go active ...
 (last message repeated 25 times, once per second, through 18:55:27)
2023/02/16 18:55:28 Index is 11736605722088051215 now active
2023/02/16 18:55:30 Using n1ql client
2023/02/16 18:55:31 Expected and Actual scan responses are the same
--- PASS: TestCreateAfterDropWhileIndexBuilding (151.60s)
=== RUN   TestDropBuildingIndex1
2023/02/16 18:55:31 In TestDropBuildingIndex1()
2023/02/16 18:55:31 In DropAllSecondaryIndexes()
2023/02/16 18:55:31 Index found:  id_gender
2023/02/16 18:55:31 Dropped index id_gender
2023/02/16 18:55:36 Setting JSON docs in KV
2023/02/16 18:56:14 Created the secondary index id_company. Waiting for it to become active
2023/02/16 18:56:14 Index is 16726025114418335031 now active
2023/02/16 18:56:45 Dropping the secondary index id_age
2023/02/16 18:56:45 Index dropped
2023/02/16 18:57:12 Created the secondary index id_age. Waiting for it to become active
2023/02/16 18:57:12 Index is 5946773829535864135 now active
2023/02/16 18:57:14 Setting JSON docs in KV
2023/02/16 18:57:24 Using n1ql client
2023/02/16 18:57:25 Expected and Actual scan responses are the same
2023/02/16 18:57:25 Using n1ql client
2023/02/16 18:57:25 Expected and Actual scan responses are the same
--- PASS: TestDropBuildingIndex1 (114.14s)
=== RUN   TestDropBuildingIndex2
2023/02/16 18:57:25 In TestDropBuildingIndex2()
2023/02/16 18:57:25 In DropAllSecondaryIndexes()
2023/02/16 18:57:25 Index found:  id_company
2023/02/16 18:57:26 Dropped index id_company
2023/02/16 18:57:26 Index found:  id_age
2023/02/16 18:57:26 Dropped index id_age
2023/02/16 18:57:31 Setting JSON docs in KV
2023/02/16 18:58:13 Created the secondary index id_company. Waiting for it to become active
2023/02/16 18:58:13 Index is 1444538498957878140 now active
2023/02/16 18:58:48 Dropping the secondary index id_company
2023/02/16 18:58:48 Index dropped
2023/02/16 18:58:48 Index is 7561501695624036739 now active
2023/02/16 18:59:17 Created the secondary index id_company. Waiting for it to become active
2023/02/16 18:59:17 Index is 12738784254270542715 now active
2023/02/16 18:59:20 Setting JSON docs in KV
2023/02/16 18:59:30 Using n1ql client
2023/02/16 18:59:30 Expected and Actual scan responses are the same
2023/02/16 18:59:31 Using n1ql client
2023/02/16 18:59:31 Expected and Actual scan responses are the same
--- PASS: TestDropBuildingIndex2 (125.61s)
=== RUN   TestDropIndexWithDataLoad
2023/02/16 18:59:31 In TestDropIndexWithDataLoad()
2023/02/16 18:59:31 In DropAllSecondaryIndexes()
2023/02/16 18:59:31 Index found:  id_age
2023/02/16 18:59:31 Dropped index id_age
2023/02/16 18:59:31 Index found:  id_company
2023/02/16 18:59:31 Dropped index id_company
2023/02/16 18:59:34 Setting JSON docs in KV
2023/02/16 19:00:10 Created the secondary index id_company. Waiting for it to become active
2023/02/16 19:00:10 Index is 14737039475941196895 now active
2023/02/16 19:00:43 Created the secondary index id_age. Waiting for it to become active
2023/02/16 19:00:43 Index is 12424892329610708096 now active
2023/02/16 19:01:16 Created the secondary index id_gender. Waiting for it to become active
2023/02/16 19:01:16 Index is 16656410138016502094 now active
2023/02/16 19:01:49 Created the secondary index id_isActive. Waiting for it to become active
2023/02/16 19:01:49 Index is 11954081077960788913 now active
2023/02/16 19:01:57 Setting JSON docs in KV
2023/02/16 19:01:57 In LoadKVBucket
2023/02/16 19:01:57 Bucket name = default
2023/02/16 19:01:57 In DropIndexWhileKVLoad
2023/02/16 19:01:58 Dropping the secondary index id_company
2023/02/16 19:01:58 Index dropped
2023/02/16 19:02:27 Using n1ql client
2023/02/16 19:02:28 Expected and Actual scan responses are the same
2023/02/16 19:02:28 Number of docScanResults and scanResults = 96712 and 96712
2023/02/16 19:02:29 Using n1ql client
2023/02/16 19:02:32 Expected and Actual scan responses are the same
2023/02/16 19:02:32 Number of docScanResults and scanResults = 420000 and 420000
--- PASS: TestDropIndexWithDataLoad (180.70s)
=== RUN   TestDropAllIndexesWithDataLoad
2023/02/16 19:02:32 In TestDropAllIndexesWithDataLoad()
2023/02/16 19:02:32 In DropAllSecondaryIndexes()
2023/02/16 19:02:32 Index found:  id_gender
2023/02/16 19:02:32 Dropped index id_gender
2023/02/16 19:02:32 Index found:  id_isActive
2023/02/16 19:02:32 Dropped index id_isActive
2023/02/16 19:02:32 Index found:  id_age
2023/02/16 19:02:32 Dropped index id_age
2023/02/16 19:02:34 Setting JSON docs in KV
2023/02/16 19:03:18 Created the secondary index id_company. Waiting for it to become active
2023/02/16 19:03:18 Index is 9857000721050991015 now active
2023/02/16 19:03:54 Created the secondary index id_age. Waiting for it to become active
2023/02/16 19:03:54 Index is 7392984440017231348 now active
2023/02/16 19:04:28 Created the secondary index id_gender. Waiting for it to become active
2023/02/16 19:04:28 Index is 14991029403089184458 now active
2023/02/16 19:05:07 Created the secondary index id_isActive. Waiting for it to become active
2023/02/16 19:05:07 Index is 10344397006977504264 now active
2023/02/16 19:05:15 Setting JSON docs in KV
2023/02/16 19:05:15 In LoadKVBucket
2023/02/16 19:05:15 In DropIndexWhileKVLoad
2023/02/16 19:05:15 In DropIndexWhileKVLoad
2023/02/16 19:05:15 In DropIndexWhileKVLoad
2023/02/16 19:05:15 Bucket name = default
2023/02/16 19:05:15 In DropIndexWhileKVLoad
2023/02/16 19:05:16 Dropping the secondary index id_gender
2023/02/16 19:05:16 Dropping the secondary index id_isActive
2023/02/16 19:05:16 Dropping the secondary index id_age
2023/02/16 19:05:16 Dropping the secondary index id_company
2023/02/16 19:05:16 Index dropped
2023/02/16 19:05:17 Index dropped
2023/02/16 19:05:17 Index dropped
2023/02/16 19:05:17 Index dropped
2023/02/16 19:05:34 Using n1ql client
2023/02/16 19:05:34 Scan failed as expected with error: Index Not Found - cause: GSI index id_company not found.
--- PASS: TestDropAllIndexesWithDataLoad (182.64s)
=== RUN   TestCreateBucket_AnotherIndexBuilding
2023/02/16 19:05:34 In TestCreateBucket_AnotherIndexBuilding()
2023/02/16 19:05:34 In DropAllSecondaryIndexes()
2023/02/16 19:06:13 Flushed the bucket default, Response body: 
2023/02/16 19:06:16 Modified parameters of bucket default, responseBody: 
2023/02/16 19:06:16 http://127.0.0.1:9000/pools/default/buckets/multi_buck2
2023/02/16 19:06:16 &{DELETE http://127.0.0.1:9000/pools/default/buckets/multi_buck2 HTTP/1.1 1 1 map[Authorization:[Basic QWRtaW5pc3RyYXRvcjphc2Rhc2Q=] Content-Type:[application/x-www-form-urlencoded; charset=UTF-8]]   0 [] false 127.0.0.1:9000 map[] map[]  map[]      0xc000136000}
2023/02/16 19:06:16 &{404 Object Not Found 404 HTTP/1.1 1 1 map[Cache-Control:[no-cache,no-store,must-revalidate] Content-Length:[31] Content-Type:[text/plain] Date:[Thu, 16 Feb 2023 13:36:16 GMT] Expires:[Thu, 01 Jan 1970 00:00:00 GMT] Pragma:[no-cache] Server:[Couchbase Server] X-Content-Type-Options:[nosniff] X-Frame-Options:[DENY] X-Permitted-Cross-Domain-Policies:[none] X-Xss-Protection:[1; mode=block]] 0xc03b4dde00 31 [] false false map[] 0xc0000a3c00 }
2023/02/16 19:06:16 DeleteBucket failed for bucket multi_buck2 
2023/02/16 19:06:16 Deleted bucket multi_buck2, responseBody: Requested resource not found.
2023/02/16 19:06:31 Setting JSON docs in KV
2023/02/16 19:08:47 Created bucket multi_buck2, responseBody: 
2023/02/16 19:09:12 Index is 324203523781533661 now active
2023/02/16 19:09:12 Index is 10825960040766424393 now active
2023/02/16 19:09:12 Using n1ql client
2023-02-16T19:09:12.113+05:30 [Info] Refreshing indexer list due to cluster changes or auto-refresh.
2023-02-16T19:09:12.113+05:30 [Info] Refreshed Indexer List: [127.0.0.1:9106]
2023-02-16T19:09:12.115+05:30 [Info] Refreshing indexer list due to cluster changes or auto-refresh.
2023-02-16T19:09:12.115+05:30 [Info] Refreshed Indexer List: [127.0.0.1:9106]
2023-02-16T19:09:12.117+05:30 [Info] metadata provider version changed 1233 -> 1234
2023-02-16T19:09:12.117+05:30 [Info] switched currmeta from 1233 -> 1234 force false 
2023-02-16T19:09:12.117+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T19:09:12.117+05:30 [Info] GSIC[default/multi_buck2-_default-_default-1676554752107028255] started ...
2023/02/16 19:09:12 Expected and Actual scan responses are the same
2023/02/16 19:09:12 Number of docScanResults and scanResults = 10000 and 10000
2023/02/16 19:09:12 Using n1ql client
2023/02/16 19:09:16 Expected and Actual scan responses are the same
2023/02/16 19:09:16 Number of docScanResults and scanResults = 200000 and 200000
2023/02/16 19:09:18 Deleted bucket multi_buck2, responseBody: 
2023/02/16 19:09:55 Flushed the bucket default, Response body: 
--- PASS: TestCreateBucket_AnotherIndexBuilding (260.50s)
=== RUN   TestDropBucket2Index_Bucket1IndexBuilding
2023/02/16 19:09:55 In TestDropBucket2Index_Bucket1IndexBuilding()
2023/02/16 19:09:55 In DropAllSecondaryIndexes()
2023/02/16 19:09:55 Index found:  buck1_idx
2023/02/16 19:09:55 Dropped index buck1_idx
2023/02/16 19:10:33 Flushed the bucket default, Response body: 
2023/02/16 19:10:36 Modified parameters of bucket default, responseBody: 
2023/02/16 19:10:36 http://127.0.0.1:9000/pools/default/buckets/multibucket_test3
2023/02/16 19:10:36 &{DELETE http://127.0.0.1:9000/pools/default/buckets/multibucket_test3 HTTP/1.1 1 1 map[Authorization:[Basic QWRtaW5pc3RyYXRvcjphc2Rhc2Q=] Content-Type:[application/x-www-form-urlencoded; charset=UTF-8]]   0 [] false 127.0.0.1:9000 map[] map[]  map[]      0xc000136000}
2023/02/16 19:10:36 &{404 Object Not Found 404 HTTP/1.1 1 1 map[Cache-Control:[no-cache,no-store,must-revalidate] Content-Length:[31] Content-Type:[text/plain] Date:[Thu, 16 Feb 2023 13:40:36 GMT] Expires:[Thu, 01 Jan 1970 00:00:00 GMT] Pragma:[no-cache] Server:[Couchbase Server] X-Content-Type-Options:[nosniff] X-Frame-Options:[DENY] X-Permitted-Cross-Domain-Policies:[none] X-Xss-Protection:[1; mode=block]] 0xc0092348c0 31 [] false false map[] 0xc0053c8000 }
2023/02/16 19:10:36 DeleteBucket failed for bucket multibucket_test3 
2023/02/16 19:10:36 Deleted bucket multibucket_test3, responseBody: Requested resource not found.
2023/02/16 19:10:36 Created bucket multibucket_test3, responseBody: 
2023/02/16 19:10:51 Setting JSON docs in KV
2023/02/16 19:11:56 Created the secondary index buck2_idx. Waiting for it to become active
2023/02/16 19:11:56 Index is 9112980297980095264 now active
2023/02/16 19:12:10 Dropping the secondary index buck2_idx
2023/02/16 19:12:10 Index dropped
2023/02/16 19:12:10 Index is 3360264308840686423 now active
2023/02/16 19:12:11 Using n1ql client
2023/02/16 19:12:12 Expected and Actual scan responses are the same
2023/02/16 19:12:12 Number of docScanResults and scanResults = 100000 and 100000
2023/02/16 19:12:14 Deleted bucket multibucket_test3, responseBody: 
2023/02/16 19:12:52 Flushed the bucket default, Response body: 
--- PASS: TestDropBucket2Index_Bucket1IndexBuilding (176.81s)
=== RUN   TestDeleteBucketWhileInitialIndexBuild
2023/02/16 19:12:52 In TestDeleteBucketWhileInitialIndexBuild()
2023/02/16 19:12:52 ============== DBG: Drop all indexes in all buckets
2023/02/16 19:12:52 In DropAllSecondaryIndexes()
2023/02/16 19:12:52 Index found:  buck1_idx
2023/02/16 19:12:52 Dropped index buck1_idx
2023/02/16 19:12:52 ============== DBG: Delete bucket default
2023/02/16 19:12:54 Deleted bucket default, responseBody: 
2023/02/16 19:12:54 ============== DBG: Create bucket default
2023/02/16 19:12:54 Created bucket default, responseBody: 
2023/02/16 19:12:57 Flush Enabled on bucket default, responseBody: 
2023/02/16 19:13:30 Flushed the bucket default, Response body: 
2023/02/16 19:13:30 ============== DBG: Delete bucket testbucket2
2023/02/16 19:13:30 http://127.0.0.1:9000/pools/default/buckets/testbucket2
2023/02/16 19:13:30 &{DELETE http://127.0.0.1:9000/pools/default/buckets/testbucket2 HTTP/1.1 1 1 map[Authorization:[Basic QWRtaW5pc3RyYXRvcjphc2Rhc2Q=] Content-Type:[application/x-www-form-urlencoded; charset=UTF-8]]   0 [] false 127.0.0.1:9000 map[] map[]  map[]      0xc000136000}
2023/02/16 19:13:30 &{404 Object Not Found 404 HTTP/1.1 1 1 map[Cache-Control:[no-cache,no-store,must-revalidate] Content-Length:[31] Content-Type:[text/plain] Date:[Thu, 16 Feb 2023 13:43:30 GMT] Expires:[Thu, 01 Jan 1970 00:00:00 GMT] Pragma:[no-cache] Server:[Couchbase Server] X-Content-Type-Options:[nosniff] X-Frame-Options:[DENY] X-Permitted-Cross-Domain-Policies:[none] X-Xss-Protection:[1; mode=block]] 0xc008c814c0 31 [] false false map[] 0xc008f0e300 }
2023/02/16 19:13:30 DeleteBucket failed for bucket testbucket2 
2023/02/16 19:13:30 Deleted bucket testbucket2, responseBody: Requested resource not found.
2023/02/16 19:13:30 ============== DBG: Create bucket testbucket2
2023/02/16 19:13:30 Created bucket testbucket2, responseBody: 
2023/02/16 19:13:33 Flush Enabled on bucket testbucket2, responseBody: 
2023/02/16 19:14:07 Flushed the bucket testbucket2, Response body: 
2023/02/16 19:14:07 ============== DBG: Delete bucket testbucket3
2023/02/16 19:14:07 http://127.0.0.1:9000/pools/default/buckets/testbucket3
2023/02/16 19:14:07 &{DELETE http://127.0.0.1:9000/pools/default/buckets/testbucket3 HTTP/1.1 1 1 map[Authorization:[Basic QWRtaW5pc3RyYXRvcjphc2Rhc2Q=] Content-Type:[application/x-www-form-urlencoded; charset=UTF-8]]   0 [] false 127.0.0.1:9000 map[] map[]  map[]      0xc000136000}
2023/02/16 19:14:07 &{404 Object Not Found 404 HTTP/1.1 1 1 map[Cache-Control:[no-cache,no-store,must-revalidate] Content-Length:[31] Content-Type:[text/plain] Date:[Thu, 16 Feb 2023 13:44:07 GMT] Expires:[Thu, 01 Jan 1970 00:00:00 GMT] Pragma:[no-cache] Server:[Couchbase Server] X-Content-Type-Options:[nosniff] X-Frame-Options:[DENY] X-Permitted-Cross-Domain-Policies:[none] X-Xss-Protection:[1; mode=block]] 0xc013cd91c0 31 [] false false map[] 0xc000f7a200 }
2023/02/16 19:14:07 DeleteBucket failed for bucket testbucket3 
2023/02/16 19:14:07 Deleted bucket testbucket3, responseBody: Requested resource not found.
2023/02/16 19:14:07 ============== DBG: Create bucket testbucket3
2023/02/16 19:14:07 Created bucket testbucket3, responseBody: 
2023/02/16 19:14:11 Flush Enabled on bucket testbucket3, responseBody: 
2023/02/16 19:14:44 Flushed the bucket testbucket3, Response body: 
2023/02/16 19:14:44 ============== DBG: Delete bucket testbucket4
2023/02/16 19:14:44 http://127.0.0.1:9000/pools/default/buckets/testbucket4
2023/02/16 19:14:44 &{DELETE http://127.0.0.1:9000/pools/default/buckets/testbucket4 HTTP/1.1 1 1 map[Authorization:[Basic QWRtaW5pc3RyYXRvcjphc2Rhc2Q=] Content-Type:[application/x-www-form-urlencoded; charset=UTF-8]]   0 [] false 127.0.0.1:9000 map[] map[]  map[]      0xc000136000}
2023/02/16 19:14:44 &{404 Object Not Found 404 HTTP/1.1 1 1 map[Cache-Control:[no-cache,no-store,must-revalidate] Content-Length:[31] Content-Type:[text/plain] Date:[Thu, 16 Feb 2023 13:44:43 GMT] Expires:[Thu, 01 Jan 1970 00:00:00 GMT] Pragma:[no-cache] Server:[Couchbase Server] X-Content-Type-Options:[nosniff] X-Frame-Options:[DENY] X-Permitted-Cross-Domain-Policies:[none] X-Xss-Protection:[1; mode=block]] 0xc0117ac740 31 [] false false map[] 0xc000d48200 }
2023/02/16 19:14:44 DeleteBucket failed for bucket testbucket4 
2023/02/16 19:14:44 Deleted bucket testbucket4, responseBody: Requested resource not found.
2023/02/16 19:14:44 ============== DBG: Create bucket testbucket4
2023/02/16 19:14:44 Created bucket testbucket4, responseBody: 
2023/02/16 19:14:47 Flush Enabled on bucket testbucket4, responseBody: 
2023/02/16 19:15:20 Flushed the bucket testbucket4, Response body: 
2023/02/16 19:15:35 Generating docs and Populating all the buckets
2023/02/16 19:15:36 ============== DBG: Creating docs in bucket default
2023/02/16 19:15:36 ============== DBG: Creating index bucket1_age in bucket default
2023/02/16 19:15:40 Created the secondary index bucket1_age. Waiting for it become active
2023/02/16 19:15:40 Index is 6168516633622705631 now active
2023/02/16 19:15:40 ============== DBG: Creating index bucket1_gender in bucket default
2023/02/16 19:15:47 Created the secondary index bucket1_gender. Waiting for it become active
2023/02/16 19:15:47 Index is 15954180078975028202 now active
2023/02/16 19:15:47 ============== DBG: Creating docs in bucket testbucket2
2023/02/16 19:15:48 ============== DBG: Creating index bucket2_city in bucket testbucket2
2023/02/16 19:15:53 Created the secondary index bucket2_city. Waiting for it become active
2023/02/16 19:15:53 Index is 360180381090101172 now active
2023/02/16 19:15:53 ============== DBG: Creating index bucket2_company in bucket testbucket2
2023/02/16 19:16:00 Created the secondary index bucket2_company. Waiting for it become active
2023/02/16 19:16:00 Index is 6509859966174557419 now active
2023/02/16 19:16:00 ============== DBG: Creating docs in bucket testbucket3
2023/02/16 19:16:01 ============== DBG: Creating index bucket3_gender in bucket testbucket3
2023/02/16 19:16:05 Created the secondary index bucket3_gender. Waiting for it become active
2023/02/16 19:16:05 Index is 12438139961835124155 now active
2023/02/16 19:16:05 ============== DBG: Creating index bucket3_address in bucket testbucket3
2023/02/16 19:16:11 Created the secondary index bucket3_address. Waiting for it become active
2023/02/16 19:16:11 Index is 7967699574415183670 now active
2023/02/16 19:16:11 ============== DBG: First bucket scan:: Scanning index bucket1_age in bucket default
2023/02/16 19:16:11 Using n1ql client
2023-02-16T19:16:11.934+05:30 [Info] metadata provider version changed 1292 -> 1293
2023-02-16T19:16:11.934+05:30 [Info] switched currmeta from 1292 -> 1293 force false 
2023-02-16T19:16:11.934+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T19:16:11.934+05:30 [Info] GSIC[default/default-_default-_default-1676555171927436423] started ...
2023/02/16 19:16:11 ============== DBG: First bucket scan:: Expected results = 294 Actual results = 294
2023/02/16 19:16:11 Expected and Actual scan responses are the same
2023/02/16 19:16:23 ============== DBG: Creating 50K docs in bucket testbucket4
2023/02/16 19:16:52 ============== DBG: Creating index bucket4_balance asynchronously in bucket testbucket4
2023/02/16 19:17:02 ============== DBG: Deleting bucket testbucket4
2023/02/16 19:17:04 Deleted bucket testbucket4, responseBody: 
2023/02/16 19:17:04 ============== DBG: First bucket scan:: Scanning index bucket1_age in bucket default
2023/02/16 19:17:04 Using n1ql client
2023/02/16 19:17:04 ============== DBG: First bucket scan:: Expected results = 294 Actual results = 294
2023/02/16 19:17:04 Expected and Actual scan responses are the same
2023/02/16 19:17:04 ============== DBG: Second bucket scan:: Scanning index bucket2_city in bucket testbucket2
2023/02/16 19:17:04 Using n1ql client
2023-02-16T19:17:04.899+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T19:17:04.900+05:30 [Info] GSIC[default/testbucket2-_default-_default-1676555224897610451] started ...
2023/02/16 19:17:04 ============== DBG: Second bucket scan:: Expected results = 392 Actual results = 392
2023/02/16 19:17:04 Expected and Actual scan responses are the same
2023/02/16 19:17:04 ============== DBG: Third bucket scan:: Scanning index bucket3_gender in bucket testbucket3
2023/02/16 19:17:04 Using n1ql client
2023-02-16T19:17:04.907+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T19:17:04.908+05:30 [Info] GSIC[default/testbucket3-_default-_default-1676555224905751032] started ...
2023/02/16 19:17:04 ============== DBG: Third bucket scan:: Expected results = 492 Actual results = 492
2023/02/16 19:17:04 Expected and Actual scan responses are the same
2023/02/16 19:17:04 ============== DBG: Deleting buckets testbucket2 testbucket3 testbucket4
2023/02/16 19:17:07 Deleted bucket testbucket2, responseBody: 
2023/02/16 19:17:09 Deleted bucket testbucket3, responseBody: 
2023/02/16 19:17:09 http://127.0.0.1:9000/pools/default/buckets/testbucket4
2023/02/16 19:17:09 &{DELETE http://127.0.0.1:9000/pools/default/buckets/testbucket4 HTTP/1.1 1 1 map[Authorization:[Basic QWRtaW5pc3RyYXRvcjphc2Rhc2Q=] Content-Type:[application/x-www-form-urlencoded; charset=UTF-8]]   0 [] false 127.0.0.1:9000 map[] map[]  map[]      0xc000136000}
2023/02/16 19:17:09 &{404 Object Not Found 404 HTTP/1.1 1 1 map[Cache-Control:[no-cache,no-store,must-revalidate] Content-Length:[31] Content-Type:[text/plain] Date:[Thu, 16 Feb 2023 13:47:09 GMT] Expires:[Thu, 01 Jan 1970 00:00:00 GMT] Pragma:[no-cache] Server:[Couchbase Server] X-Content-Type-Options:[nosniff] X-Frame-Options:[DENY] X-Permitted-Cross-Domain-Policies:[none] X-Xss-Protection:[1; mode=block]] 0xc014fc4cc0 31 [] false false map[] 0xc0150bc500 }
2023/02/16 19:17:09 DeleteBucket failed for bucket testbucket4 
2023/02/16 19:17:09 Deleted bucket testbucket4, responseBody: Requested resource not found.
2023/02/16 19:17:12 Modified parameters of bucket default, responseBody: 
--- PASS: TestDeleteBucketWhileInitialIndexBuild (275.71s)
=== RUN   TestWherClause_UpdateDocument
2023/02/16 19:17:27 In TestWherClause_UpdateDocument()
2023/02/16 19:17:27 In DropAllSecondaryIndexes()
2023/02/16 19:17:27 Index found:  bucket1_age
2023/02/16 19:17:27 Dropped index bucket1_age
2023/02/16 19:17:27 Index found:  bucket1_gender
2023/02/16 19:17:28 Dropped index bucket1_gender
2023/02/16 19:18:06 Flushed the bucket default, Response body: 
2023/02/16 19:18:08 Setting JSON docs in KV
2023/02/16 19:18:17 Created the secondary index id_ageGreaterThan40. Waiting for it become active
2023/02/16 19:18:17 Index is 6121624221717223053 now active
2023/02/16 19:18:17 Using n1ql client
2023/02/16 19:18:17 Expected and Actual scan responses are the same
2023/02/16 19:18:17 Number of docScanResults and scanResults = 5981 and 5981
2023/02/16 19:18:22 Using n1ql client
2023/02/16 19:18:22 Expected and Actual scan responses are the same
2023/02/16 19:18:22 Number of docScanResults and scanResults = 1981 and 1981
--- PASS: TestWherClause_UpdateDocument (54.53s)
=== RUN   TestDeferFalse
2023/02/16 19:18:22 In TestDeferFalse()
2023/02/16 19:18:25 Setting JSON docs in KV
2023/02/16 19:18:40 Created the secondary index index_deferfalse1. Waiting for it become active
2023/02/16 19:18:40 Index is 13869854558879461799 now active
2023/02/16 19:18:40 Using n1ql client
2023/02/16 19:18:40 Expected and Actual scan responses are the same
--- PASS: TestDeferFalse (17.78s)
=== RUN   TestDeferFalse_CloseClientConnection
2023/02/16 19:18:40 In TestDeferFalse_CloseClientConnection()
2023/02/16 19:18:40 In CloseClientThread
2023/02/16 19:18:40 In CreateIndexThread
2023/02/16 19:18:42 Create Index call failed as expected due to error : Terminate Request due to client termination
2023/02/16 19:18:42 Waiting for index 7010857540001286808 to go active ...
2023/02/16 19:18:43 Waiting for index 7010857540001286808 to go active ...
2023/02/16 19:18:44 Waiting for index 7010857540001286808 to go active ...
2023/02/16 19:18:45 Waiting for index 7010857540001286808 to go active ...
2023/02/16 19:18:46 Waiting for index 7010857540001286808 to go active ...
2023/02/16 19:18:47 Waiting for index 7010857540001286808 to go active ...
2023/02/16 19:18:48 Index is 7010857540001286808 now active
2023/02/16 19:18:48 Using n1ql client
2023/02/16 19:18:48 Expected and Actual scan responses are the same
--- PASS: TestDeferFalse_CloseClientConnection (8.14s)
=== RUN   TestOrphanIndexCleanup
2023/02/16 19:18:48 In DropAllSecondaryIndexes()
2023/02/16 19:18:48 Index found:  id_ageGreaterThan40
2023/02/16 19:18:48 Dropped index id_ageGreaterThan40
2023/02/16 19:18:48 Index found:  index_deferfalse2
2023/02/16 19:18:48 Dropped index index_deferfalse2
2023/02/16 19:18:48 Index found:  index_deferfalse1
2023/02/16 19:18:48 Dropped index index_deferfalse1
2023/02/16 19:19:03 Created the secondary index idx1_age_regular. Waiting for it become active
2023/02/16 19:19:03 Index is 18173309907888789443 now active
2023/02/16 19:19:11 Created the secondary index idx2_company_regular. Waiting for it become active
2023/02/16 19:19:11 Index is 14179852203267208274 now active
2023/02/16 19:19:21 Using n1ql client
2023/02/16 19:19:21 Query on idx1_age_regular is successful
2023/02/16 19:19:21 Using n1ql client
2023/02/16 19:19:21 Query on idx2_company_regular is successful
Restarting indexer process ...
2023/02/16 19:19:21 []
2023-02-16T19:19:21.186+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023-02-16T19:19:21.186+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023/02/16 19:19:41 Using n1ql client
2023-02-16T19:19:41.088+05:30 [Error] transport error between 127.0.0.1:35918->127.0.0.1:9107: write tcp 127.0.0.1:35918->127.0.0.1:9107: write: broken pipe
2023-02-16T19:19:41.088+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] 7365388488588259652 request transport failed `write tcp 127.0.0.1:35918->127.0.0.1:9107: write: broken pipe`
2023-02-16T19:19:41.088+05:30 [Error] metadataClient:PickRandom: Fail to find indexer for all index partitions. Num partition 1.  Partition with instances 0 
2023-02-16T19:19:41.088+05:30 [Error] metadataClient:PickRandom: Replicas - [6342163499874408144], PrunedReplica - map[], FilteredReplica map[]
2023/02/16 19:19:41 Query on idx1_age_regular is successful - after indexer restart.
2023/02/16 19:19:41 Using n1ql client
2023/02/16 19:19:41 Query on idx2_company_regular is successful - after indexer restart.
--- PASS: TestOrphanIndexCleanup (52.77s)
=== RUN   TestOrphanPartitionCleanup
2023/02/16 19:19:46 Created the secondary index idx3_age_regular. Waiting for it become active
2023/02/16 19:19:46 Index is 2652934214639248108 now active
2023/02/16 19:19:56 Using n1ql client
2023/02/16 19:19:56 Query on idx3_age_regular is successful
Restarting indexer process ...
2023/02/16 19:19:56 []
2023-02-16T19:19:56.666+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023-02-16T19:19:56.667+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023/02/16 19:20:16 Using n1ql client
2023-02-16T19:20:16.631+05:30 [Error] transport error between 127.0.0.1:54210->127.0.0.1:9107: write tcp 127.0.0.1:54210->127.0.0.1:9107: write: broken pipe
2023-02-16T19:20:16.631+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] 4950308676650927125 request transport failed `write tcp 127.0.0.1:54210->127.0.0.1:9107: write: broken pipe`
2023-02-16T19:20:16.631+05:30 [Error] metadataClient:PickRandom: Fail to find indexer for all index partitions. Num partition 8.  Partition with instances 0 
2023-02-16T19:20:16.631+05:30 [Error] metadataClient:PickRandom: Replicas - [9694746868959652093], PrunedReplica - map[], FilteredReplica map[]
2023/02/16 19:20:16 Query on idx3_age_regular is successful - after indexer restart.
--- PASS: TestOrphanPartitionCleanup (35.52s)
=== RUN   TestIndexerSettings
2023/02/16 19:20:16 In TestIndexerSettings()
2023/02/16 19:20:16 Changing config key indexer.settings.max_cpu_percent to value 300
2023/02/16 19:20:16 Changing config key indexer.settings.inmemory_snapshot.interval to value 300
2023/02/16 19:20:17 Changing config key indexer.settings.persisted_snapshot.interval to value 20000
2023/02/16 19:20:17 Changing config key indexer.settings.recovery.max_rollbacks to value 3
2023/02/16 19:20:17 Changing config key indexer.settings.log_level to value error
--- PASS: TestIndexerSettings (0.59s)
=== RUN   TestRestoreDefaultSettings
2023/02/16 19:20:17 In TestIndexerSettings_RestoreDefault()
2023/02/16 19:20:17 Changing config key indexer.settings.max_cpu_percent to value 0
2023/02/16 19:20:17 Changing config key indexer.settings.inmemory_snapshot.interval to value 200
2023/02/16 19:20:17 Changing config key indexer.settings.persisted_snapshot.interval to value 5000
2023/02/16 19:20:17 Changing config key indexer.settings.recovery.max_rollbacks to value 5
2023/02/16 19:20:17 Changing config key indexer.settings.log_level to value info
--- PASS: TestRestoreDefaultSettings (0.55s)
=== RUN   TestStat_ItemsCount
2023/02/16 19:20:17 In TestStat_ItemsCount()
2023/02/16 19:20:17 In DropAllSecondaryIndexes()
2023/02/16 19:20:17 Index found:  idx2_company_regular
2023/02/16 19:20:17 Dropped index idx2_company_regular
2023/02/16 19:20:17 Index found:  idx3_age_regular
2023/02/16 19:20:17 Dropped index idx3_age_regular
2023/02/16 19:20:17 Index found:  idx1_age_regular
2023/02/16 19:20:18 Dropped index idx1_age_regular
2023/02/16 19:20:18 Emptying the default bucket
2023/02/16 19:20:21 Flush Enabled on bucket default, responseBody: 
2023/02/16 19:20:58 Flushed the bucket default, Response body: 
2023/02/16 19:20:58 Generating JSON docs
2023/02/16 19:20:59 Setting initial JSON docs in KV
2023/02/16 19:21:04 Creating a 2i
2023/02/16 19:21:07 Created the secondary index index_test1. Waiting for it become active
2023/02/16 19:21:07 Index is 13898580945338226670 now active
2023/02/16 19:21:12 items_count stat is 10000
--- PASS: TestStat_ItemsCount (54.92s)
=== RUN   TestRangeArrayIndex_Distinct
2023/02/16 19:21:12 In TestRangeArrayIndex_Distinct()
2023/02/16 19:21:12 In DropAllSecondaryIndexes()
2023/02/16 19:21:12 Index found:  index_test1
2023/02/16 19:21:12 Dropped index index_test1
2023/02/16 19:21:50 Flushed the bucket default, Response body: 
2023/02/16 19:21:54 Created the secondary index arridx_friends. Waiting for it become active
2023/02/16 19:21:54 Index is 17595817502308830941 now active
2023-02-16T19:21:54.532+05:30 [Error] transport error between 127.0.0.1:49028->127.0.0.1:9107: write tcp 127.0.0.1:49028->127.0.0.1:9107: write: broken pipe
2023-02-16T19:21:54.532+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"]  request transport failed `write tcp 127.0.0.1:49028->127.0.0.1:9107: write: broken pipe`
2023-02-16T19:21:54.534+05:30 [Warn] scan failed: requestId  queryport 127.0.0.1:9107 inst 16118002596218652028 partition [0]
2023-02-16T19:21:54.534+05:30 [Warn] Scan failed with error for index 17595817502308830941.  Trying scan again with replica, reqId: :  write tcp 127.0.0.1:49028->127.0.0.1:9107: write: broken pipe from [127.0.0.1:9107] ...
2023-02-16T19:21:54.535+05:30 [Error] metadataClient:PickRandom: Fail to find indexer for all index partitions. Num partition 1.  Partition with instances 0 
2023-02-16T19:21:54.535+05:30 [Error] metadataClient:PickRandom: Replicas - [16118002596218652028], PrunedReplica - map[], FilteredReplica map[]
2023-02-16T19:21:54.535+05:30 [Warn] Fail to find indexers to satisfy query request.  Trying scan again for index 17595817502308830941, reqId: :  write tcp 127.0.0.1:49028->127.0.0.1:9107: write: broken pipe from [127.0.0.1:9107] ...
2023/02/16 19:21:54 Expected and Actual scan responses are the same
2023/02/16 19:21:57 Expected and Actual scan responses are the same
--- PASS: TestRangeArrayIndex_Distinct (44.75s)
=== RUN   TestUpdateArrayIndex_Distinct
2023/02/16 19:21:57 In TestUpdateArrayIndex_Distinct()
2023/02/16 19:21:57 In DropAllSecondaryIndexes()
2023/02/16 19:21:57 Index found:  arridx_friends
2023/02/16 19:21:57 Dropped index arridx_friends
2023/02/16 19:22:35 Flushed the bucket default, Response body: 
2023/02/16 19:22:39 Created the secondary index arridx_friends. Waiting for it become active
2023/02/16 19:22:39 Index is 14418410607559365121 now active
2023/02/16 19:22:39 Expected and Actual scan responses are the same
2023/02/16 19:22:42 Expected and Actual scan responses are the same
2023/02/16 19:22:43 Expected and Actual scan responses are the same
--- PASS: TestUpdateArrayIndex_Distinct (45.64s)
=== RUN   TestRangeArrayIndex_Duplicate
2023/02/16 19:22:43 In TestRangeArrayIndex_Duplicate()
2023/02/16 19:22:43 In DropAllSecondaryIndexes()
2023/02/16 19:22:43 Index found:  arridx_friends
2023/02/16 19:22:43 Dropped index arridx_friends
2023/02/16 19:23:20 Flushed the bucket default, Response body: 
2023/02/16 19:23:24 Created the secondary index arridx_friends. Waiting for it become active
2023/02/16 19:23:24 Index is 12520601091777160111 now active
2023/02/16 19:23:24 Expected and Actual scan responses are the same
2023/02/16 19:23:27 Expected and Actual scan responses are the same
--- PASS: TestRangeArrayIndex_Duplicate (44.12s)
=== RUN   TestUpdateArrayIndex_Duplicate
2023/02/16 19:23:27 In TestUpdateArrayIndex_Duplicate()
2023/02/16 19:23:27 In DropAllSecondaryIndexes()
2023/02/16 19:23:27 Index found:  arridx_friends
2023/02/16 19:23:27 Dropped index arridx_friends
2023/02/16 19:24:05 Flushed the bucket default, Response body: 
2023/02/16 19:24:09 Created the secondary index arridx_friends. Waiting for it become active
2023/02/16 19:24:09 Index is 12859938266576726385 now active
2023/02/16 19:24:09 Expected and Actual scan responses are the same
2023/02/16 19:24:12 Expected and Actual scan responses are the same
2023/02/16 19:24:12 Expected and Actual scan responses are the same
--- PASS: TestUpdateArrayIndex_Duplicate (45.07s)
=== RUN   TestArrayIndexCornerCases
2023/02/16 19:24:12 In TestArrayIndexCornerCases()
2023/02/16 19:24:16 Created the secondary index arr_single. Waiting for it become active
2023/02/16 19:24:16 Index is 9979793556203566844 now active
2023/02/16 19:24:22 Created the secondary index arr_leading. Waiting for it become active
2023/02/16 19:24:22 Index is 8846806166319020031 now active
2023/02/16 19:24:28 Created the secondary index arr_nonleading. Waiting for it become active
2023/02/16 19:24:28 Index is 2165796218023955964 now active
2023/02/16 19:24:28 

--------ScanAll for EMPTY array--------
2023/02/16 19:24:28 Count of scanResults is 0
2023/02/16 19:24:28 Count of scanResults is 0
2023/02/16 19:24:28 Count of scanResults is 1
2023/02/16 19:24:28 Key: string 6762700763999619893  Value: value.Values ["Ar3Kr" Missing field or index.] false
2023/02/16 19:24:28 Expected and Actual scan responses are the same
2023/02/16 19:24:28 

--------ScanAll for MISSING array--------
2023/02/16 19:24:28 Count of scanResults is 0
2023/02/16 19:24:28 Count of scanResults is 0
2023/02/16 19:24:28 Count of scanResults is 1
2023/02/16 19:24:28 Key: string 6762700763999619893  Value: value.Values ["quelc" Missing field or index.] false
2023/02/16 19:24:28 Expected and Actual scan responses are the same
2023/02/16 19:24:28 

--------ScanAll for NULL array--------
2023/02/16 19:24:28 Count of scanResults is 1
2023/02/16 19:24:28 Key: string 6762700763999619893  Value: value.Values [null] false
2023/02/16 19:24:28 Expected and Actual scan responses are the same
2023/02/16 19:24:28 Count of scanResults is 1
2023/02/16 19:24:28 Key: string 6762700763999619893  Value: value.Values [null "dGEPd"] false
2023/02/16 19:24:28 Expected and Actual scan responses are the same
2023/02/16 19:24:28 Count of scanResults is 1
2023/02/16 19:24:28 Key: string 6762700763999619893  Value: value.Values ["dGEPd" null] false
2023/02/16 19:24:28 Expected and Actual scan responses are the same
2023/02/16 19:24:28 

--------ScanAll for SCALARVALUE array--------
2023/02/16 19:24:28 Count of scanResults is 1
2023/02/16 19:24:28 Key: string 6762700763999619893  Value: value.Values ["IamScalar"] false
2023/02/16 19:24:28 Expected and Actual scan responses are the same
2023/02/16 19:24:28 Count of scanResults is 1
2023/02/16 19:24:28 Key: string 6762700763999619893  Value: value.Values ["IamScalar" "6iyHQq"] false
2023/02/16 19:24:28 Expected and Actual scan responses are the same
2023/02/16 19:24:28 Count of scanResults is 1
2023/02/16 19:24:28 Key: string 6762700763999619893  Value: value.Values ["6iyHQq" "IamScalar"] false
2023/02/16 19:24:28 Expected and Actual scan responses are the same
2023/02/16 19:24:28 

--------ScanAll for SCALAROBJECT array--------
2023/02/16 19:24:28 Count of scanResults is 1
2023/02/16 19:24:28 Key: string 6762700763999619893  Value: value.Values [{"1":"abc","2":"def"}] false
2023/02/16 19:24:28 Expected and Actual scan responses are the same
2023/02/16 19:24:28 Count of scanResults is 1
2023/02/16 19:24:28 Key: string 6762700763999619893  Value: value.Values [{"1":"abc","2":"def"} "QCmOp"] false
2023/02/16 19:24:28 Expected and Actual scan responses are the same
2023/02/16 19:24:28 Count of scanResults is 1
2023/02/16 19:24:28 Key: string 6762700763999619893  Value: value.Values ["QCmOp" {"1":"abc","2":"def"}] false
2023/02/16 19:24:28 Expected and Actual scan responses are the same
--- PASS: TestArrayIndexCornerCases (16.27s)
=== RUN   TestArraySizeIncreaseDecrease1
2023/02/16 19:24:28 In TestArraySizeIncreaseDecrease1()
2023/02/16 19:24:28 In DropAllSecondaryIndexes()
2023/02/16 19:24:28 Index found:  arr_leading
2023/02/16 19:24:28 Dropped index arr_leading
2023/02/16 19:24:28 Index found:  arr_nonleading
2023/02/16 19:24:28 Dropped index arr_nonleading
2023/02/16 19:24:28 Index found:  arr_single
2023/02/16 19:24:28 Dropped index arr_single
2023/02/16 19:24:28 Index found:  arridx_friends
2023/02/16 19:24:28 Dropped index arridx_friends
2023/02/16 19:25:06 Flushed the bucket default, Response body: 
2023/02/16 19:25:06 Changing config key indexer.settings.allow_large_keys to value false
2023/02/16 19:25:07 Changing config key indexer.settings.max_seckey_size to value 100
2023/02/16 19:25:07 Changing config key indexer.settings.max_array_seckey_size to value 2000
2023/02/16 19:25:08 Start of createArrayDocs()
2023/02/16 19:25:26 End of createArrayDocs()
2023/02/16 19:25:26 Start of createArrayDocs()
2023/02/16 19:25:26 End of createArrayDocs()
2023/02/16 19:25:29 Created the secondary index arr1. Waiting for it become active
2023/02/16 19:25:29 Index is 15253557073847734213 now active
2023/02/16 19:25:36 Created the secondary index arr2. Waiting for it become active
2023/02/16 19:25:36 Index is 9770906959799353149 now active
2023/02/16 19:25:42 Created the secondary index idx3. Waiting for it become active
2023/02/16 19:25:42 Index is 10464659457571941016 now active
2023/02/16 19:25:42 Using n1ql client
2023/02/16 19:25:42 Length of scanResults = 10
2023/02/16 19:25:42 Changing config key indexer.settings.max_seckey_size to value 4096
2023/02/16 19:25:42 Changing config key indexer.settings.max_array_seckey_size to value 51200
2023/02/16 19:25:50 Expected and Actual scan responses are the same
2023/02/16 19:25:50 Using n1ql client
2023/02/16 19:25:50 Expected and Actual scan responses are the same
2023/02/16 19:25:50 Using n1ql client
2023/02/16 19:25:50 Expected and Actual scan responses are the same
2023/02/16 19:25:50 Changing config key indexer.settings.max_seckey_size to value 100
2023/02/16 19:25:50 Changing config key indexer.settings.max_array_seckey_size to value 2200
2023/02/16 19:25:58 Using n1ql client
2023/02/16 19:25:58 Length of scanResults = 10
2023/02/16 19:25:58 Changing config key indexer.settings.max_seckey_size to value 4608
2023/02/16 19:25:58 Changing config key indexer.settings.max_array_seckey_size to value 10240
--- PASS: TestArraySizeIncreaseDecrease1 (91.34s)
=== RUN   TestArraySizeIncreaseDecrease2
2023/02/16 19:25:59 In TestArraySizeIncreaseDecrease2()
2023/02/16 19:25:59 In DropAllSecondaryIndexes()
2023/02/16 19:25:59 Index found:  arr2
2023/02/16 19:25:59 Dropped index arr2
2023/02/16 19:25:59 Index found:  idx3
2023/02/16 19:26:00 Dropped index idx3
2023/02/16 19:26:00 Index found:  arr1
2023/02/16 19:26:00 Dropped index arr1
2023/02/16 19:26:37 Flushed the bucket default, Response body: 
2023/02/16 19:26:38 Changing config key indexer.settings.allow_large_keys to value true
2023/02/16 19:26:39 Changing config key indexer.settings.max_seckey_size to value 100
2023/02/16 19:26:39 Changing config key indexer.settings.max_array_seckey_size to value 2000
2023/02/16 19:26:40 Start of createArrayDocs()
2023/02/16 19:26:57 End of createArrayDocs()
2023/02/16 19:26:57 Start of createArrayDocs()
2023/02/16 19:26:57 End of createArrayDocs()
2023/02/16 19:27:08 Created the secondary index arr1. Waiting for it become active
2023/02/16 19:27:08 Index is 16674949281090302062 now active
2023/02/16 19:27:14 Created the secondary index arr2. Waiting for it become active
2023/02/16 19:27:14 Index is 3725423326326885845 now active
2023/02/16 19:27:20 Created the secondary index idx3. Waiting for it become active
2023/02/16 19:27:20 Index is 14324526501839659188 now active
2023/02/16 19:27:21 Expected and Actual scan responses are the same
2023/02/16 19:27:21 Using n1ql client
2023/02/16 19:27:21 Expected and Actual scan responses are the same
2023/02/16 19:27:21 Using n1ql client
2023/02/16 19:27:21 Expected and Actual scan responses are the same
2023/02/16 19:27:21 Changing config key indexer.settings.max_seckey_size to value 4096
2023/02/16 19:27:21 Changing config key indexer.settings.max_array_seckey_size to value 51200
2023/02/16 19:27:26 Expected and Actual scan responses are the same
2023/02/16 19:27:26 Using n1ql client
2023/02/16 19:27:26 Expected and Actual scan responses are the same
2023/02/16 19:27:26 Using n1ql client
2023/02/16 19:27:26 Expected and Actual scan responses are the same
2023/02/16 19:27:26 Changing config key indexer.settings.max_seckey_size to value 100
2023/02/16 19:27:26 Changing config key indexer.settings.max_array_seckey_size to value 2200
2023/02/16 19:27:30 Expected and Actual scan responses are the same
2023/02/16 19:27:30 Using n1ql client
2023/02/16 19:27:30 Expected and Actual scan responses are the same
2023/02/16 19:27:30 Using n1ql client
2023/02/16 19:27:30 Expected and Actual scan responses are the same
2023/02/16 19:27:30 Changing config key indexer.settings.max_seckey_size to value 4608
2023/02/16 19:27:30 Changing config key indexer.settings.max_array_seckey_size to value 10240
--- PASS: TestArraySizeIncreaseDecrease2 (91.77s)
=== RUN   TestBufferedScan_BackfillDisabled
2023/02/16 19:27:31 In TestBufferedScan_BackfillDisabled()
2023/02/16 19:27:31 In DropAllSecondaryIndexes()
2023/02/16 19:27:31 Index found:  idx3
2023/02/16 19:27:31 Dropped index idx3
2023/02/16 19:27:31 Index found:  arr1
2023/02/16 19:27:31 Dropped index arr1
2023/02/16 19:27:31 Index found:  arr2
2023/02/16 19:27:31 Dropped index arr2
2023/02/16 19:28:09 Flushed the bucket default, Response body: 
2023/02/16 19:28:48 Changing config key queryport.client.settings.backfillLimit to value 0
2023/02/16 19:28:55 Created the secondary index addressidx. Waiting for it become active
2023/02/16 19:28:55 Index is 3117210484312045066 now active
2023-02-16T19:28:55.815+05:30 [Info] metadata provider version changed 1509 -> 1510
2023-02-16T19:28:55.816+05:30 [Info] switched currmeta from 1509 -> 1510 force false 
2023-02-16T19:28:55.816+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T19:28:55.816+05:30 [Info] GSIC[default/default-_default-_default-1676555935806138237] started ...
2023-02-16T19:28:55.816+05:30 [Warn] MonitorIndexer: Indexer for default:_default:_default is already being monitored
2023/02/16 19:28:55 Non-backfill file found: /tmp/.ICE-unix
2023/02/16 19:28:55 Non-backfill file found: /tmp/.Test-unix
2023/02/16 19:28:55 Non-backfill file found: /tmp/.X11-unix
2023/02/16 19:28:55 Non-backfill file found: /tmp/.XIM-unix
2023/02/16 19:28:55 Non-backfill file found: /tmp/.font-unix
2023/02/16 19:28:55 Non-backfill file found: /tmp/448e181f-7051-413a-b3fe-b164aafa05e6.sock
2023/02/16 19:28:55 Non-backfill file found: /tmp/TestPause
2023/02/16 19:28:55 Non-backfill file found: /tmp/f_448e181f-7051-413a-b3fe-b164aafa05e6.sock
2023/02/16 19:28:55 Non-backfill file found: /tmp/fail.log
2023/02/16 19:28:55 Non-backfill file found: /tmp/go-build2643775603
2023/02/16 19:28:55 Non-backfill file found: /tmp/mdbslice
2023/02/16 19:28:55 Non-backfill file found: /tmp/systemd-private-6908f2d18abe4679bc187adf3be6e73c-apache2.service-xeEl7u
2023/02/16 19:28:55 Non-backfill file found: /tmp/systemd-private-6908f2d18abe4679bc187adf3be6e73c-systemd-timesyncd.service-yoMRv3
2023-02-16T19:28:55.820+05:30 [Info] Refreshing indexer list due to cluster changes or auto-refresh.
2023-02-16T19:28:55.820+05:30 [Info] Refreshed Indexer List: [127.0.0.1:9106]
2023/02/16 19:28:55 limit=1,chsize=256; received 1 items; took 3.286756ms
2023-02-16T19:28:55.822+05:30 [Info] switched currmeta from 1514 -> 1514 force true 
2023-02-16T19:28:55.826+05:30 [Info] Refreshing indexer list due to cluster changes or auto-refresh.
2023-02-16T19:28:55.826+05:30 [Info] Refreshed Indexer List: [127.0.0.1:9106]
2023-02-16T19:28:55.829+05:30 [Info] switched currmeta from 1510 -> 1510 force true 
2023-02-16T19:28:57.587+05:30 [Info] serviceChangeNotifier: received PoolChangeNotification
2023-02-16T19:28:57.592+05:30 [Info] Refreshing indexer list due to cluster changes or auto-refresh.
2023-02-16T19:28:57.592+05:30 [Info] Refreshed Indexer List: [127.0.0.1:9106]
2023-02-16T19:28:57.593+05:30 [Info] Refreshing indexer list due to cluster changes or auto-refresh.
2023-02-16T19:28:57.593+05:30 [Info] Refreshed Indexer List: [127.0.0.1:9106]
2023-02-16T19:28:57.593+05:30 [Info] switched currmeta from 1510 -> 1510 force true 
2023-02-16T19:28:57.596+05:30 [Info] switched currmeta from 1514 -> 1514 force true 
2023/02/16 19:28:58 limit=1000,chsize=256; received 1000 items; took 1.246883178s
2023-02-16T19:28:58.198+05:30 [Info] Rollback time has changed for index inst 17358035790044102662. New rollback time 1676555397983948346
2023-02-16T19:28:58.198+05:30 [Info] Rollback time has changed for index inst 17358035790044102662. New rollback time 1676555397983948346
2023-02-16T19:28:59.069+05:30 [Info] gsiKeyspace::Close Closing default:_default:_default
--- PASS: TestBufferedScan_BackfillDisabled (87.40s)
=== RUN   TestBufferedScan_BackfillEnabled
2023/02/16 19:28:59 In TestBufferedScan_BackfillEnabled()
2023-02-16T19:28:59.169+05:30 [Info] MetadataProvider.CheckIndexerStatus(): adminport=127.0.0.1:9106 connected=true
2023/02/16 19:28:59 Changing config key queryport.client.settings.backfillLimit to value 1
2023-02-16T19:28:59.189+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T19:28:59.189+05:30 [Info] GSIC[default/default-_default-_default-1676555939183653231] started ...
2023-02-16T19:28:59.194+05:30 [Info] New settings received: 
{"indexer.api.enableTestServer":true,"indexer.plasma.backIndex.enableInMemoryCompression":true,"indexer.plasma.backIndex.enablePageBloomFilter":false,"indexer.plasma.mainIndex.enableInMemoryCompression":true,"indexer.settings.allow_large_keys":true,"indexer.settings.bufferPoolBlockSize":16384,"indexer.settings.build.batch_size":5,"indexer.settings.compaction.abort_exceed_interval":false,"indexer.settings.compaction.check_period":30,"indexer.settings.compaction.compaction_mode":"circular","indexer.settings.compaction.days_of_week":"Sunday,Monday,Tuesday,Wednesday,Thursday,Friday,Saturday","indexer.settings.compaction.interval":"00:00,00:00","indexer.settings.compaction.min_frag":30,"indexer.settings.compaction.min_size":524288000,"indexer.settings.compaction.plasma.manual":false,"indexer.settings.compaction.plasma.optional.decrement":5,"indexer.settings.compaction.plasma.optional.min_frag":20,"indexer.settings.compaction.plasma.optional.quota":25,"indexer.settings.corrupt_index_num_backups":1,"indexer.settings.cpuProfDir":"","indexer.settings.cpuProfile":false,"indexer.settings.eTagPeriod":240,"indexer.settings.enable_corrupt_index_backup":false,"indexer.settings.enable_page_bloom_filter":false,"indexer.settings.fast_flush_mode":true,"indexer.settings.gc_percent":100,"indexer.settings.inmemory_snapshot.fdb.interval":200,"indexer.settings.inmemory_snapshot.interval":200,"indexer.settings.inmemory_snapshot.moi.interval":10,"indexer.settings.largeSnapshotThreshold":200,"indexer.settings.log_level":"info","indexer.settings.maxVbQueueLength":0,"indexer.settings.max_array_seckey_size":10240,"indexer.settings.max_cpu_percent":0,"indexer.settings.max_seckey_size":4608,"indexer.settings.max_writer_lock_prob":20,"indexer.settings.memProfDir":"","indexer.settings.memProfile":false,"indexer.settings.memory_quota":1572864000,"indexer.settings.minVbQueueLength":50,"indexer.settings.moi.debug":false,"indexer.settings.moi.persistence_threads":2,"indexer.settings.moi.recovery.max_rollbacks":2,"indexer.settings.moi.recovery_threads":4,"indexer.settings.num_replica":0,"indexer.settings.persisted_snapshot.fdb.interval":5000,"indexer.settings.persisted_snapshot.interval":5000,"indexer.settings.persisted_snapshot.moi.interval":60000,"indexer.settings.persisted_snapshot_init_build.fdb.interval":5000,"indexer.settings.persisted_snapshot_init_build.interval":5000,"indexer.settings.persisted_snapshot_init_build.moi.interval":60000,"indexer.settings.plasma.recovery.max_rollbacks":2,"indexer.settings.rebalance.blob_storage_bucket":"","indexer.settings.rebalance.blob_storage_prefix":"","indexer.settings.rebalance.blob_storage_region":"","indexer.settings.rebalance.blob_storage_scheme":"","indexer.settings.rebalance.redistribute_indexes":false,"indexer.settings.recovery.max_rollbacks":5,"indexer.settings.scan_getseqnos_retries":30,"indexer.settings.scan_timeout":0,"indexer.settings.send_buffer_size":1024,"indexer.settings.serverless.indexLimit":201,"indexer.settings.sliceBufSize":800,"indexer.settings.smallSnapshotThreshold":30,"indexer.settings.snapshotListeners":2,"indexer.settings.snapshotRequestWorkers":2,"indexer.settings.statsLogDumpInterval":60,"indexer.settings.storage_mode":"plasma","indexer.settings.storage_mode.disable_upgrade":true,"indexer.settings.thresholds.mem_high":70,"indexer.settings.thresholds.mem_low":50,"indexer.settings.thresholds.units_high":60,"indexer.settings.thresholds.units_low":40,"indexer.settings.units_quota":10000,"indexer.settings.wal_size":4096,"projector.settings.log_level":"info","queryport.client.log_level":"warn","queryport.client.settings.backfillLimit":1,"queryport.client.settings.minPoolSizeWM":1000,"queryport.client.settings.poolOverflow":30,"queryport.client.settings.poolSize":5000,"queryport.client.settings.relConnBatchSize":100}
2023/02/16 19:28:59 limit=1,chsize=256; received 1 items; took 11.624402ms
2023/02/16 19:29:00 limit=1000,chsize=256; received 1000 items; took 16.582577ms
2023/02/16 19:29:12 limit=1000,chsize=256; received 1000 items; took 10.362400737s
Scan error: bufferedscan temp file size exceeded limit 1, 13 - cause: bufferedscan temp file size exceeded limit 1, 13
Scan error:  bufferedscan temp file size exceeded limit 1, 13 - cause:  bufferedscan temp file size exceeded limit 1, 13
Scan error: bufferedscan temp file size exceeded limit 1, 13 - cause: bufferedscan temp file size exceeded limit 1, 13
Scan error:  bufferedscan temp file size exceeded limit 1, 13 - cause:  bufferedscan temp file size exceeded limit 1, 13
2023/02/16 19:29:26 limit=1000,chsize=256; received 644 items; took 13.3508261s
2023/02/16 19:29:26 limit=1000,chsize=256; received 644 items; took 13.35083412s
2023/02/16 19:29:28 Changing config key queryport.client.settings.backfillLimit to value 0
--- PASS: TestBufferedScan_BackfillEnabled (28.98s)
=== RUN   TestMultiScanSetup
2023/02/16 19:29:28 In TestMultiScanSetup()
2023/02/16 19:29:29 Emptying the default bucket
2023/02/16 19:29:33 Flush Enabled on bucket default, responseBody: 
2023/02/16 19:30:09 Flushed the bucket default, Response body: 
2023/02/16 19:30:09 Populating the default bucket
2023/02/16 19:30:23 Created the secondary index index_companyname. Waiting for it to become active
2023/02/16 19:30:23 Index is 17519765663212224104 now active
2023/02/16 19:30:30 Created the secondary index index_company. Waiting for it to become active
2023/02/16 19:30:30 Index is 3830533585937572090 now active
2023/02/16 19:30:36 Created the secondary index index_company_name_age. Waiting for it to become active
2023/02/16 19:30:36 Index is 7718840463001836012 now active
2023/02/16 19:30:43 Created the secondary index index_primary. Waiting for it to become active
2023/02/16 19:30:43 Index is 15627791414161192985 now active
2023/02/16 19:30:49 Created the secondary index index_company_name_age_address. Waiting for it to become active
2023/02/16 19:30:49 Index is 11678063306673777449 now active
2023/02/16 19:30:56 Created the secondary index index_company_name_age_address_friends. Waiting for it to become active
2023/02/16 19:30:56 Index is 2253126634705892985 now active
--- PASS: TestMultiScanSetup (88.78s)
=== RUN   TestMultiScanCount
2023/02/16 19:30:56 In TestMultiScanCount()
2023/02/16 19:30:56 

--------- Composite Index with 2 fields ---------
2023/02/16 19:30:56 
--- ScanAllNoFilter ---
2023/02/16 19:30:56 distinct = false
2023/02/16 19:30:57 Using n1ql client
2023/02/16 19:30:57 MultiScanCount = 10002 ExpectedMultiScanCount = 10002
2023/02/16 19:30:57 
--- ScanAllFilterNil ---
2023/02/16 19:30:57 distinct = false
2023/02/16 19:30:57 Using n1ql client
2023/02/16 19:30:57 MultiScanCount = 10002 ExpectedMultiScanCount = 10002
2023/02/16 19:30:57 
--- ScanAll_AllFiltersNil ---
2023/02/16 19:30:57 distinct = false
2023/02/16 19:30:58 Using n1ql client
2023/02/16 19:30:58 MultiScanCount = 10002 ExpectedMultiScanCount = 10002
2023/02/16 19:30:58 
--- SingleSeek ---
2023/02/16 19:30:58 distinct = false
2023/02/16 19:30:58 Using n1ql client
2023/02/16 19:30:58 MultiScanCount = 1 ExpectedMultiScanCount = 1
2023/02/16 19:30:58 
--- MultipleSeek ---
2023/02/16 19:30:58 distinct = false
2023/02/16 19:30:58 Using n1ql client
2023/02/16 19:30:58 MultiScanCount = 2 ExpectedMultiScanCount = 2
2023/02/16 19:30:58 
--- SimpleRange ---
2023/02/16 19:30:58 distinct = false
2023/02/16 19:30:59 Using n1ql client
2023/02/16 19:30:59 MultiScanCount = 2273 ExpectedMultiScanCount = 2273
2023/02/16 19:30:59 
--- NonOverlappingRanges ---
2023/02/16 19:30:59 distinct = false
2023/02/16 19:31:00 Using n1ql client
2023/02/16 19:31:00 MultiScanCount = 4283 ExpectedMultiScanCount = 4283
2023/02/16 19:31:00 
--- OverlappingRanges ---
2023/02/16 19:31:00 distinct = false
2023/02/16 19:31:01 Using n1ql client
2023/02/16 19:31:01 MultiScanCount = 5756 ExpectedMultiScanCount = 5756
2023/02/16 19:31:01 
--- NonOverlappingFilters ---
2023/02/16 19:31:01 distinct = false
2023/02/16 19:31:01 Using n1ql client
2023/02/16 19:31:01 MultiScanCount = 337 ExpectedMultiScanCount = 337
2023/02/16 19:31:01 
--- OverlappingFilters ---
2023/02/16 19:31:01 distinct = false
2023/02/16 19:31:01 Using n1ql client
2023/02/16 19:31:01 MultiScanCount = 2559 ExpectedMultiScanCount = 2559
2023/02/16 19:31:01 
--- BoundaryFilters ---
2023/02/16 19:31:01 distinct = false
2023/02/16 19:31:02 Using n1ql client
2023/02/16 19:31:02 MultiScanCount = 499 ExpectedMultiScanCount = 499
2023/02/16 19:31:02 
--- SeekAndFilters_NonOverlapping ---
2023/02/16 19:31:02 distinct = false
2023/02/16 19:31:02 Using n1ql client
2023/02/16 19:31:02 MultiScanCount = 256 ExpectedMultiScanCount = 256
2023/02/16 19:31:02 
--- SeekAndFilters_Overlapping ---
2023/02/16 19:31:02 distinct = false
2023/02/16 19:31:03 Using n1ql client
2023/02/16 19:31:03 MultiScanCount = 255 ExpectedMultiScanCount = 255
2023/02/16 19:31:03 
--- SimpleRangeLowUnbounded ---
2023/02/16 19:31:03 distinct = false
2023/02/16 19:31:03 Using n1ql client
2023/02/16 19:31:03 MultiScanCount = 5618 ExpectedMultiScanCount = 5618
2023/02/16 19:31:03 
--- SimpleRangeHighUnbounded ---
2023/02/16 19:31:03 distinct = false
2023/02/16 19:31:03 Using n1ql client
2023/02/16 19:31:03 MultiScanCount = 3704 ExpectedMultiScanCount = 3704
2023/02/16 19:31:03 
--- SimpleRangeMultipleUnbounded ---
2023/02/16 19:31:03 distinct = false
2023/02/16 19:31:04 Using n1ql client
2023/02/16 19:31:04 MultiScanCount = 10002 ExpectedMultiScanCount = 10002
2023/02/16 19:31:04 
--- FiltersWithUnbounded ---
2023/02/16 19:31:04 distinct = false
2023/02/16 19:31:04 Using n1ql client
2023/02/16 19:31:04 MultiScanCount = 3173 ExpectedMultiScanCount = 3173
2023/02/16 19:31:04 
--- FiltersLowGreaterThanHigh ---
2023/02/16 19:31:04 distinct = false
2023/02/16 19:31:05 Using n1ql client
2023/02/16 19:31:05 MultiScanCount = 418 ExpectedMultiScanCount = 418
2023/02/16 19:31:05 

--------- Simple Index with 1 field ---------
2023/02/16 19:31:05 
--- SingleIndexSimpleRange ---
2023/02/16 19:31:05 distinct = false
2023/02/16 19:31:05 Using n1ql client
2023/02/16 19:31:05 MultiScanCount = 2273 ExpectedMultiScanCount = 2273
2023/02/16 19:31:05 
--- SingleIndex_SimpleRanges_NonOverlapping ---
2023/02/16 19:31:05 distinct = false
2023/02/16 19:31:05 Using n1ql client
2023/02/16 19:31:05 MultiScanCount = 7140 ExpectedMultiScanCount = 7140
2023/02/16 19:31:05 
--- SingleIndex_SimpleRanges_Overlapping ---
2023/02/16 19:31:05 distinct = false
2023/02/16 19:31:06 Using n1ql client
2023/02/16 19:31:06 MultiScanCount = 8701 ExpectedMultiScanCount = 8701
2023/02/16 19:31:06 

--------- Composite Index with 3 fields ---------
2023/02/16 19:31:06 
--- ScanAllNoFilter ---
2023/02/16 19:31:06 distinct = false
2023/02/16 19:31:06 Using n1ql client
2023/02/16 19:31:06 MultiScanCount = 10002 ExpectedMultiScanCount = 10002
2023/02/16 19:31:06 
--- ScanAllFilterNil ---
2023/02/16 19:31:06 distinct = false
2023/02/16 19:31:07 Using n1ql client
2023/02/16 19:31:07 MultiScanCount = 10002 ExpectedMultiScanCount = 10002
2023/02/16 19:31:07 
--- ScanAll_AllFiltersNil ---
2023/02/16 19:31:07 distinct = false
2023/02/16 19:31:07 Using n1ql client
2023/02/16 19:31:07 MultiScanCount = 10002 ExpectedMultiScanCount = 10002
2023/02/16 19:31:07 
--- 3FieldsSingleSeek ---
2023/02/16 19:31:07 distinct = false
2023/02/16 19:31:08 Using n1ql client
2023/02/16 19:31:08 MultiScanCount = 1 ExpectedMultiScanCount = 1
2023/02/16 19:31:08 
--- 3FieldsMultipleSeeks ---
2023/02/16 19:31:08 distinct = false
2023/02/16 19:31:08 Using n1ql client
2023/02/16 19:31:08 MultiScanCount = 3 ExpectedMultiScanCount = 3
2023/02/16 19:31:08 
--- 3FieldsMultipleSeeks_Identical ---
2023/02/16 19:31:08 distinct = false
2023/02/16 19:31:08 Using n1ql client
2023/02/16 19:31:08 MultiScanCount = 2 ExpectedMultiScanCount = 2
2023/02/16 19:31:08 

--------- New scenarios ---------
2023/02/16 19:31:08 
--- CompIndexHighUnbounded1 ---
2023/02/16 19:31:08 
--- Multi Scan 0 ---
2023/02/16 19:31:08 distinct = false
2023/02/16 19:31:09 Using n1ql client
2023/02/16 19:31:09 Using n1ql client
2023/02/16 19:31:09 len(scanResults) = 8 MultiScanCount = 8
2023/02/16 19:31:09 Expected and Actual scan responses are the same
2023/02/16 19:31:09 
--- Multi Scan 1 ---
2023/02/16 19:31:09 distinct = false
2023/02/16 19:31:09 Using n1ql client
2023/02/16 19:31:09 Using n1ql client
2023/02/16 19:31:09 len(scanResults) = 0 MultiScanCount = 0
2023/02/16 19:31:09 Expected and Actual scan responses are the same
2023/02/16 19:31:09 
--- Multi Scan 2 ---
2023/02/16 19:31:09 distinct = false
2023/02/16 19:31:10 Using n1ql client
2023/02/16 19:31:10 Using n1ql client
2023/02/16 19:31:10 len(scanResults) = 9 MultiScanCount = 9
2023/02/16 19:31:10 Expected and Actual scan responses are the same
2023/02/16 19:31:10 
--- CompIndexHighUnbounded2 ---
2023/02/16 19:31:10 
--- Multi Scan 0 ---
2023/02/16 19:31:10 distinct = false
2023/02/16 19:31:10 Using n1ql client
2023/02/16 19:31:10 Using n1ql client
2023/02/16 19:31:10 len(scanResults) = 4138 MultiScanCount = 4138
2023/02/16 19:31:10 Expected and Actual scan responses are the same
2023/02/16 19:31:10 
--- Multi Scan 1 ---
2023/02/16 19:31:10 distinct = false
2023/02/16 19:31:10 Using n1ql client
2023/02/16 19:31:10 Using n1ql client
2023/02/16 19:31:10 len(scanResults) = 2746 MultiScanCount = 2746
2023/02/16 19:31:10 Expected and Actual scan responses are the same
2023/02/16 19:31:10 
--- Multi Scan 2 ---
2023/02/16 19:31:10 distinct = false
2023/02/16 19:31:11 Using n1ql client
2023/02/16 19:31:11 Using n1ql client
2023/02/16 19:31:11 len(scanResults) = 4691 MultiScanCount = 4691
2023/02/16 19:31:11 Expected and Actual scan responses are the same
2023/02/16 19:31:11 
--- CompIndexHighUnbounded3 ---
2023/02/16 19:31:11 
--- Multi Scan 0 ---
2023/02/16 19:31:11 distinct = false
2023/02/16 19:31:11 Using n1ql client
2023/02/16 19:31:11 Using n1ql client
2023/02/16 19:31:11 len(scanResults) = 1329 MultiScanCount = 1329
2023/02/16 19:31:11 Expected and Actual scan responses are the same
2023/02/16 19:31:11 
--- CompIndexHighUnbounded4 ---
2023/02/16 19:31:11 
--- Multi Scan 0 ---
2023/02/16 19:31:11 distinct = false
2023/02/16 19:31:12 Using n1ql client
2023/02/16 19:31:12 Using n1ql client
2023/02/16 19:31:12 len(scanResults) = 5349 MultiScanCount = 5349
2023/02/16 19:31:12 Expected and Actual scan responses are the same
2023/02/16 19:31:12 
--- CompIndexHighUnbounded5 ---
2023/02/16 19:31:12 
--- Multi Scan 0 ---
2023/02/16 19:31:12 distinct = false
2023/02/16 19:31:12 Using n1ql client
2023/02/16 19:31:12 Using n1ql client
2023/02/16 19:31:12 len(scanResults) = 8210 MultiScanCount = 8210
2023/02/16 19:31:12 Expected and Actual scan responses are the same
2023/02/16 19:31:12 
--- SeekBoundaries ---
2023/02/16 19:31:12 
--- Multi Scan 0 ---
2023/02/16 19:31:12 distinct = false
2023/02/16 19:31:13 Using n1ql client
2023/02/16 19:31:13 Using n1ql client
2023/02/16 19:31:13 len(scanResults) = 175 MultiScanCount = 175
2023/02/16 19:31:13 Expected and Actual scan responses are the same
2023/02/16 19:31:13 
--- Multi Scan 1 ---
2023/02/16 19:31:13 distinct = false
2023/02/16 19:31:13 Using n1ql client
2023/02/16 19:31:13 Using n1ql client
2023/02/16 19:31:13 len(scanResults) = 1 MultiScanCount = 1
2023/02/16 19:31:13 Expected and Actual scan responses are the same
2023/02/16 19:31:13 
--- Multi Scan 2 ---
2023/02/16 19:31:13 distinct = false
2023/02/16 19:31:13 Using n1ql client
2023/02/16 19:31:13 Using n1ql client
2023/02/16 19:31:13 len(scanResults) = 555 MultiScanCount = 555
2023/02/16 19:31:13 Expected and Actual scan responses are the same
2023/02/16 19:31:13 
--- Multi Scan 3 ---
2023/02/16 19:31:13 distinct = false
2023/02/16 19:31:14 Using n1ql client
2023/02/16 19:31:14 Using n1ql client
2023/02/16 19:31:14 len(scanResults) = 872 MultiScanCount = 872
2023/02/16 19:31:14 Expected and Actual scan responses are the same
2023/02/16 19:31:14 
--- Multi Scan 4 ---
2023/02/16 19:31:14 distinct = false
2023/02/16 19:31:14 Using n1ql client
2023/02/16 19:31:14 Using n1ql client
2023/02/16 19:31:14 len(scanResults) = 287 MultiScanCount = 287
2023/02/16 19:31:14 Expected and Actual scan responses are the same
2023/02/16 19:31:14 
--- Multi Scan 5 ---
2023/02/16 19:31:14 distinct = false
2023/02/16 19:31:15 Using n1ql client
2023/02/16 19:31:15 Using n1ql client
2023/02/16 19:31:15 len(scanResults) = 5254 MultiScanCount = 5254
2023/02/16 19:31:15 Expected and Actual scan responses are the same
2023/02/16 19:31:15 
--- Multi Scan 6 ---
2023/02/16 19:31:15 distinct = false
2023/02/16 19:31:15 Using n1ql client
2023/02/16 19:31:15 Using n1ql client
2023/02/16 19:31:15 len(scanResults) = 5566 MultiScanCount = 5566
2023/02/16 19:31:15 Expected and Actual scan responses are the same
2023/02/16 19:31:15 
--- Multi Scan 7 ---
2023/02/16 19:31:15 distinct = false
2023/02/16 19:31:16 Using n1ql client
2023/02/16 19:31:16 Using n1ql client
2023/02/16 19:31:16 len(scanResults) = 8 MultiScanCount = 8
2023/02/16 19:31:16 Expected and Actual scan responses are the same
2023/02/16 19:31:16 

--------- With DISTINCT True ---------
2023/02/16 19:31:16 
--- ScanAllNoFilter ---
2023/02/16 19:31:16 distinct = true
2023/02/16 19:31:16 Using n1ql client
2023/02/16 19:31:16 MultiScanCount = 999 ExpectedMultiScanCount = 999
2023/02/16 19:31:16 
--- ScanAllFilterNil ---
2023/02/16 19:31:16 distinct = true
2023/02/16 19:31:16 Using n1ql client
2023/02/16 19:31:16 MultiScanCount = 999 ExpectedMultiScanCount = 999
2023/02/16 19:31:16 
--- ScanAll_AllFiltersNil ---
2023/02/16 19:31:16 distinct = true
2023/02/16 19:31:17 Using n1ql client
2023/02/16 19:31:17 MultiScanCount = 999 ExpectedMultiScanCount = 999
2023/02/16 19:31:17 
--- SingleSeek ---
2023/02/16 19:31:17 distinct = true
2023/02/16 19:31:17 Using n1ql client
2023/02/16 19:31:17 MultiScanCount = 1 ExpectedMultiScanCount = 1
2023/02/16 19:31:17 
--- MultipleSeek ---
2023/02/16 19:31:17 distinct = true
2023/02/16 19:31:18 Using n1ql client
2023/02/16 19:31:18 MultiScanCount = 2 ExpectedMultiScanCount = 2
2023/02/16 19:31:18 
--- SimpleRange ---
2023/02/16 19:31:18 distinct = true
2023/02/16 19:31:18 Using n1ql client
2023/02/16 19:31:18 MultiScanCount = 227 ExpectedMultiScanCount = 227
2023/02/16 19:31:18 
--- NonOverlappingRanges ---
2023/02/16 19:31:18 distinct = true
2023/02/16 19:31:18 Using n1ql client
2023/02/16 19:31:18 MultiScanCount = 428 ExpectedMultiScanCount = 428
2023/02/16 19:31:18 
--- OverlappingRanges ---
2023/02/16 19:31:18 distinct = true
2023/02/16 19:31:19 Using n1ql client
2023/02/16 19:31:19 MultiScanCount = 575 ExpectedMultiScanCount = 575
2023/02/16 19:31:19 
--- NonOverlappingFilters ---
2023/02/16 19:31:19 distinct = true
2023/02/16 19:31:19 Using n1ql client
2023/02/16 19:31:19 MultiScanCount = 186 ExpectedMultiScanCount = 186
2023/02/16 19:31:19 
--- NonOverlappingFilters2 ---
2023/02/16 19:31:19 distinct = true
2023/02/16 19:31:20 Using n1ql client
2023/02/16 19:31:20 MultiScanCount = 1 ExpectedMultiScanCount = 1
2023/02/16 19:31:20 
--- OverlappingFilters ---
2023/02/16 19:31:20 distinct = true
2023/02/16 19:31:20 Using n1ql client
2023/02/16 19:31:20 MultiScanCount = 543 ExpectedMultiScanCount = 543
2023/02/16 19:31:20 
--- BoundaryFilters ---
2023/02/16 19:31:20 distinct = true
2023/02/16 19:31:20 Using n1ql client
2023/02/16 19:31:20 MultiScanCount = 172 ExpectedMultiScanCount = 172
2023/02/16 19:31:20 
--- SeekAndFilters_NonOverlapping ---
2023/02/16 19:31:20 distinct = true
2023/02/16 19:31:21 Using n1ql client
2023/02/16 19:31:21 MultiScanCount = 135 ExpectedMultiScanCount = 135
2023/02/16 19:31:21 
--- SeekAndFilters_Overlapping ---
2023/02/16 19:31:21 distinct = true
2023/02/16 19:31:21 Using n1ql client
2023/02/16 19:31:21 MultiScanCount = 134 ExpectedMultiScanCount = 134
2023/02/16 19:31:21 
--- SimpleRangeLowUnbounded ---
2023/02/16 19:31:21 distinct = false
2023/02/16 19:31:22 Using n1ql client
2023/02/16 19:31:22 MultiScanCount = 5618 ExpectedMultiScanCount = 5618
2023/02/16 19:31:22 
--- SimpleRangeHighUnbounded ---
2023/02/16 19:31:22 distinct = false
2023/02/16 19:31:22 Using n1ql client
2023/02/16 19:31:22 MultiScanCount = 3704 ExpectedMultiScanCount = 3704
2023/02/16 19:31:22 
--- SimpleRangeMultipleUnbounded ---
2023/02/16 19:31:22 distinct = false
2023/02/16 19:31:22 Using n1ql client
2023/02/16 19:31:23 MultiScanCount = 10002 ExpectedMultiScanCount = 10002
2023/02/16 19:31:23 
--- FiltersWithUnbounded ---
2023/02/16 19:31:23 distinct = false
2023/02/16 19:31:23 Using n1ql client
2023/02/16 19:31:23 MultiScanCount = 3173 ExpectedMultiScanCount = 3173
2023/02/16 19:31:23 
--- FiltersLowGreaterThanHigh ---
2023/02/16 19:31:23 distinct = false
2023/02/16 19:31:23 Using n1ql client
2023/02/16 19:31:23 MultiScanCount = 418 ExpectedMultiScanCount = 418
2023/02/16 19:31:23 

--------- Simple Index with 1 field ---------
2023/02/16 19:31:23 
--- SingleIndexSimpleRange ---
2023/02/16 19:31:23 distinct = true
2023/02/16 19:31:24 Using n1ql client
2023/02/16 19:31:24 MultiScanCount = 227 ExpectedMultiScanCount = 227
2023/02/16 19:31:24 
--- SingleIndex_SimpleRanges_NonOverlapping ---
2023/02/16 19:31:24 distinct = true
2023/02/16 19:31:24 Using n1ql client
2023/02/16 19:31:24 MultiScanCount = 713 ExpectedMultiScanCount = 713
2023/02/16 19:31:24 
--- SingleIndex_SimpleRanges_Overlapping ---
2023/02/16 19:31:24 distinct = true
2023/02/16 19:31:24 Using n1ql client
2023/02/16 19:31:24 MultiScanCount = 869 ExpectedMultiScanCount = 869
2023/02/16 19:31:24 

--------- Composite Index with 3 fields ---------
2023/02/16 19:31:24 
--- ScanAllNoFilter ---
2023/02/16 19:31:24 distinct = true
2023/02/16 19:31:25 Using n1ql client
2023/02/16 19:31:25 MultiScanCount = 999 ExpectedMultiScanCount = 999
2023/02/16 19:31:25 
--- ScanAllFilterNil ---
2023/02/16 19:31:25 distinct = true
2023/02/16 19:31:25 Using n1ql client
2023/02/16 19:31:25 MultiScanCount = 999 ExpectedMultiScanCount = 999
2023/02/16 19:31:25 
--- ScanAll_AllFiltersNil ---
2023/02/16 19:31:25 distinct = true
2023/02/16 19:31:26 Using n1ql client
2023/02/16 19:31:26 MultiScanCount = 999 ExpectedMultiScanCount = 999
2023/02/16 19:31:26 
--- 3FieldsSingleSeek ---
2023/02/16 19:31:26 distinct = true
2023/02/16 19:31:26 Using n1ql client
2023/02/16 19:31:26 MultiScanCount = 1 ExpectedMultiScanCount = 1
2023/02/16 19:31:26 
--- 3FieldsMultipleSeeks ---
2023/02/16 19:31:26 distinct = true
2023/02/16 19:31:27 Using n1ql client
2023/02/16 19:31:27 MultiScanCount = 3 ExpectedMultiScanCount = 3
2023/02/16 19:31:27 
--- 3FieldsMultipleSeeks_Identical ---
2023/02/16 19:31:27 distinct = true
2023/02/16 19:31:27 Using n1ql client
2023/02/16 19:31:27 MultiScanCount = 2 ExpectedMultiScanCount = 2
--- PASS: TestMultiScanCount (30.72s)
=== RUN   TestMultiScanScenarios
2023/02/16 19:31:27 In TestMultiScanScenarios()
2023/02/16 19:31:27 

--------- Composite Index with 2 fields ---------
2023/02/16 19:31:27 
--- ScanAllNoFilter ---
2023/02/16 19:31:27 distinct = false
2023/02/16 19:31:27 Using n1ql client
2023/02/16 19:31:28 Expected and Actual scan responses are the same
2023/02/16 19:31:28 
--- ScanAllFilterNil ---
2023/02/16 19:31:28 distinct = false
2023/02/16 19:31:28 Using n1ql client
2023/02/16 19:31:28 Expected and Actual scan responses are the same
2023/02/16 19:31:28 
--- ScanAll_AllFiltersNil ---
2023/02/16 19:31:28 distinct = false
2023/02/16 19:31:28 Using n1ql client
2023/02/16 19:31:28 Expected and Actual scan responses are the same
2023/02/16 19:31:28 
--- SingleSeek ---
2023/02/16 19:31:28 distinct = false
2023/02/16 19:31:29 Using n1ql client
2023/02/16 19:31:29 Expected and Actual scan responses are the same
2023/02/16 19:31:29 
--- MultipleSeek ---
2023/02/16 19:31:29 distinct = false
2023/02/16 19:31:29 Using n1ql client
2023/02/16 19:31:29 Expected and Actual scan responses are the same
2023/02/16 19:31:29 
--- SimpleRange ---
2023/02/16 19:31:29 distinct = false
2023/02/16 19:31:30 Using n1ql client
2023/02/16 19:31:30 Expected and Actual scan responses are the same
2023/02/16 19:31:30 
--- NonOverlappingRanges ---
2023/02/16 19:31:30 distinct = false
2023/02/16 19:31:30 Using n1ql client
2023/02/16 19:31:30 Expected and Actual scan responses are the same
2023/02/16 19:31:30 
--- OverlappingRanges ---
2023/02/16 19:31:30 distinct = false
2023/02/16 19:31:30 Using n1ql client
2023/02/16 19:31:30 Expected and Actual scan responses are the same
2023/02/16 19:31:30 
--- NonOverlappingFilters ---
2023/02/16 19:31:30 distinct = false
2023/02/16 19:31:31 Using n1ql client
2023/02/16 19:31:31 Expected and Actual scan responses are the same
2023/02/16 19:31:31 
--- OverlappingFilters ---
2023/02/16 19:31:31 distinct = false
2023/02/16 19:31:31 Using n1ql client
2023/02/16 19:31:31 Expected and Actual scan responses are the same
2023/02/16 19:31:31 
--- BoundaryFilters ---
2023/02/16 19:31:31 distinct = false
2023/02/16 19:31:32 Using n1ql client
2023/02/16 19:31:32 Expected and Actual scan responses are the same
2023/02/16 19:31:32 
--- SeekAndFilters_NonOverlapping ---
2023/02/16 19:31:32 distinct = false
2023/02/16 19:31:32 Using n1ql client
2023/02/16 19:31:32 Expected and Actual scan responses are the same
2023/02/16 19:31:32 
--- SeekAndFilters_Overlapping ---
2023/02/16 19:31:32 distinct = false
2023/02/16 19:31:33 Using n1ql client
2023/02/16 19:31:33 Expected and Actual scan responses are the same
2023/02/16 19:31:33 
--- SimpleRangeLowUnbounded ---
2023/02/16 19:31:33 distinct = false
2023/02/16 19:31:33 Using n1ql client
2023/02/16 19:31:33 Expected and Actual scan responses are the same
2023/02/16 19:31:33 
--- SimpleRangeHighUnbounded ---
2023/02/16 19:31:33 distinct = false
2023/02/16 19:31:33 Using n1ql client
2023/02/16 19:31:33 Expected and Actual scan responses are the same
2023/02/16 19:31:33 
--- SimpleRangeMultipleUnbounded ---
2023/02/16 19:31:33 distinct = false
2023/02/16 19:31:34 Using n1ql client
2023/02/16 19:31:34 Expected and Actual scan responses are the same
2023/02/16 19:31:34 
--- FiltersWithUnbounded ---
2023/02/16 19:31:34 distinct = false
2023/02/16 19:31:34 Using n1ql client
2023/02/16 19:31:34 Expected and Actual scan responses are the same
2023/02/16 19:31:34 
--- FiltersLowGreaterThanHigh ---
2023/02/16 19:31:34 distinct = false
2023/02/16 19:31:35 Using n1ql client
2023/02/16 19:31:35 Expected and Actual scan responses are the same
2023/02/16 19:31:35 

--------- Simple Index with 1 field ---------
2023/02/16 19:31:35 
--- SingleIndexSimpleRange ---
2023/02/16 19:31:35 distinct = false
2023/02/16 19:31:35 Using n1ql client
2023/02/16 19:31:35 Expected and Actual scan responses are the same
2023/02/16 19:31:35 
--- SingleIndex_SimpleRanges_NonOverlapping ---
2023/02/16 19:31:35 distinct = false
2023/02/16 19:31:35 Using n1ql client
2023/02/16 19:31:35 Expected and Actual scan responses are the same
2023/02/16 19:31:35 
--- SingleIndex_SimpleRanges_Overlapping ---
2023/02/16 19:31:35 distinct = false
2023/02/16 19:31:36 Using n1ql client
2023/02/16 19:31:36 Expected and Actual scan responses are the same
2023/02/16 19:31:36 

--------- Composite Index with 3 fields ---------
2023/02/16 19:31:36 
--- ScanAllNoFilter ---
2023/02/16 19:31:36 distinct = false
2023/02/16 19:31:36 Using n1ql client
2023/02/16 19:31:36 Expected and Actual scan responses are the same
2023/02/16 19:31:36 
--- ScanAllFilterNil ---
2023/02/16 19:31:36 distinct = false
2023/02/16 19:31:37 Using n1ql client
2023/02/16 19:31:37 Expected and Actual scan responses are the same
2023/02/16 19:31:37 
--- ScanAll_AllFiltersNil ---
2023/02/16 19:31:37 distinct = false
2023/02/16 19:31:37 Using n1ql client
2023/02/16 19:31:37 Expected and Actual scan responses are the same
2023/02/16 19:31:37 
--- 3FieldsSingleSeek ---
2023/02/16 19:31:37 distinct = false
2023/02/16 19:31:38 Using n1ql client
2023/02/16 19:31:38 Expected and Actual scan responses are the same
2023/02/16 19:31:38 
--- 3FieldsMultipleSeeks ---
2023/02/16 19:31:38 distinct = false
2023/02/16 19:31:38 Using n1ql client
2023/02/16 19:31:38 Expected and Actual scan responses are the same
2023/02/16 19:31:38 
--- 3FieldsMultipleSeeks_Identical ---
2023/02/16 19:31:38 distinct = false
2023/02/16 19:31:39 Using n1ql client
2023/02/16 19:31:39 Expected and Actual scan responses are the same
2023/02/16 19:31:39 

--------- New scenarios ---------
2023/02/16 19:31:39 
--- CompIndexHighUnbounded1 ---
2023/02/16 19:31:39 
--- Multi Scan 0 ---
2023/02/16 19:31:39 distinct = false
2023/02/16 19:31:39 Using n1ql client
2023/02/16 19:31:39 Expected and Actual scan responses are the same
2023/02/16 19:31:39 
--- Multi Scan 1 ---
2023/02/16 19:31:39 distinct = false
2023/02/16 19:31:40 Using n1ql client
2023/02/16 19:31:40 Expected and Actual scan responses are the same
2023/02/16 19:31:40 
--- Multi Scan 2 ---
2023/02/16 19:31:40 distinct = false
2023/02/16 19:31:40 Using n1ql client
2023/02/16 19:31:40 Expected and Actual scan responses are the same
2023/02/16 19:31:40 
--- CompIndexHighUnbounded2 ---
2023/02/16 19:31:40 
--- Multi Scan 0 ---
2023/02/16 19:31:40 distinct = false
2023/02/16 19:31:40 Using n1ql client
2023/02/16 19:31:40 Expected and Actual scan responses are the same
2023/02/16 19:31:40 
--- Multi Scan 1 ---
2023/02/16 19:31:40 distinct = false
2023/02/16 19:31:41 Using n1ql client
2023/02/16 19:31:41 Expected and Actual scan responses are the same
2023/02/16 19:31:41 
--- Multi Scan 2 ---
2023/02/16 19:31:41 distinct = false
2023/02/16 19:31:41 Using n1ql client
2023/02/16 19:31:41 Expected and Actual scan responses are the same
2023/02/16 19:31:41 
--- CompIndexHighUnbounded3 ---
2023/02/16 19:31:41 
--- Multi Scan 0 ---
2023/02/16 19:31:41 distinct = false
2023/02/16 19:31:42 Using n1ql client
2023/02/16 19:31:42 Expected and Actual scan responses are the same
2023/02/16 19:31:42 
--- CompIndexHighUnbounded4 ---
2023/02/16 19:31:42 
--- Multi Scan 0 ---
2023/02/16 19:31:42 distinct = false
2023/02/16 19:31:42 Using n1ql client
2023/02/16 19:31:42 Expected and Actual scan responses are the same
2023/02/16 19:31:42 
--- CompIndexHighUnbounded5 ---
2023/02/16 19:31:42 
--- Multi Scan 0 ---
2023/02/16 19:31:42 distinct = false
2023/02/16 19:31:43 Using n1ql client
2023/02/16 19:31:43 Expected and Actual scan responses are the same
2023/02/16 19:31:43 
--- SeekBoundaries ---
2023/02/16 19:31:43 
--- Multi Scan 0 ---
2023/02/16 19:31:43 distinct = false
2023/02/16 19:31:43 Using n1ql client
2023/02/16 19:31:43 Expected and Actual scan responses are the same
2023/02/16 19:31:43 
--- Multi Scan 1 ---
2023/02/16 19:31:43 distinct = false
2023/02/16 19:31:43 Using n1ql client
2023/02/16 19:31:43 Expected and Actual scan responses are the same
2023/02/16 19:31:43 
--- Multi Scan 2 ---
2023/02/16 19:31:43 distinct = false
2023/02/16 19:31:44 Using n1ql client
2023/02/16 19:31:44 Expected and Actual scan responses are the same
2023/02/16 19:31:44 
--- Multi Scan 3 ---
2023/02/16 19:31:44 distinct = false
2023/02/16 19:31:44 Using n1ql client
2023/02/16 19:31:44 Expected and Actual scan responses are the same
2023/02/16 19:31:44 
--- Multi Scan 4 ---
2023/02/16 19:31:44 distinct = false
2023/02/16 19:31:45 Using n1ql client
2023/02/16 19:31:45 Expected and Actual scan responses are the same
2023/02/16 19:31:45 
--- Multi Scan 5 ---
2023/02/16 19:31:45 distinct = false
2023/02/16 19:31:45 Using n1ql client
2023/02/16 19:31:45 Expected and Actual scan responses are the same
2023/02/16 19:31:45 
--- Multi Scan 6 ---
2023/02/16 19:31:45 distinct = false
2023/02/16 19:31:45 Using n1ql client
2023/02/16 19:31:45 Expected and Actual scan responses are the same
2023/02/16 19:31:45 
--- Multi Scan 7 ---
2023/02/16 19:31:45 distinct = false
2023/02/16 19:31:46 Using n1ql client
2023/02/16 19:31:46 Expected and Actual scan responses are the same
2023/02/16 19:31:46 
--- PrefixSortVariations ---
2023/02/16 19:31:46 
--- Multi Scan 0 ---
2023/02/16 19:31:46 distinct = false
2023/02/16 19:31:46 Using n1ql client
2023/02/16 19:31:46 Expected and Actual scan responses are the same
2023/02/16 19:31:46 
--- Multi Scan 1 ---
2023/02/16 19:31:46 distinct = false
2023/02/16 19:31:47 Using n1ql client
2023/02/16 19:31:47 Expected and Actual scan responses are the same
--- PASS: TestMultiScanScenarios (19.66s)
=== RUN   TestMultiScanOffset
2023/02/16 19:31:47 In TestMultiScanOffset()
2023/02/16 19:31:47 

--------- Composite Index with 2 fields ---------
2023/02/16 19:31:47 
--- ScanAllNoFilter ---
2023/02/16 19:31:47 distinct = false
2023/02/16 19:31:47 Using n1ql client
2023/02/16 19:31:47 
--- ScanAllFilterNil ---
2023/02/16 19:31:47 distinct = false
2023/02/16 19:31:47 Using n1ql client
2023/02/16 19:31:48 
--- ScanAll_AllFiltersNil ---
2023/02/16 19:31:48 distinct = false
2023/02/16 19:31:48 Using n1ql client
2023/02/16 19:31:48 
--- SingleSeek ---
2023/02/16 19:31:48 distinct = false
2023/02/16 19:31:48 Using n1ql client
2023/02/16 19:31:48 
--- MultipleSeek ---
2023/02/16 19:31:48 distinct = false
2023/02/16 19:31:49 Using n1ql client
2023/02/16 19:31:49 
--- SimpleRange ---
2023/02/16 19:31:49 distinct = false
2023/02/16 19:31:49 Using n1ql client
2023/02/16 19:31:49 
--- NonOverlappingRanges ---
2023/02/16 19:31:49 distinct = false
2023/02/16 19:31:49 Using n1ql client
2023/02/16 19:31:49 
--- OverlappingRanges ---
2023/02/16 19:31:49 distinct = false
2023/02/16 19:31:50 Using n1ql client
2023/02/16 19:31:50 
--- NonOverlappingFilters ---
2023/02/16 19:31:50 distinct = false
2023/02/16 19:31:50 Using n1ql client
2023/02/16 19:31:50 
--- OverlappingFilters ---
2023/02/16 19:31:50 distinct = false
2023/02/16 19:31:51 Using n1ql client
2023/02/16 19:31:51 
--- BoundaryFilters ---
2023/02/16 19:31:51 distinct = false
2023/02/16 19:31:51 Using n1ql client
2023/02/16 19:31:51 
--- SeekAndFilters_NonOverlapping ---
2023/02/16 19:31:51 distinct = false
2023/02/16 19:31:52 Using n1ql client
2023/02/16 19:31:52 
--- SeekAndFilters_Overlapping ---
2023/02/16 19:31:52 distinct = false
2023/02/16 19:31:52 Using n1ql client
2023/02/16 19:31:52 
--- SimpleRangeLowUnbounded ---
2023/02/16 19:31:52 distinct = false
2023/02/16 19:31:52 Using n1ql client
2023/02/16 19:31:52 Expected and Actual scan responses are the same
2023/02/16 19:31:52 
--- SimpleRangeHighUnbounded ---
2023/02/16 19:31:52 distinct = false
2023/02/16 19:31:53 Using n1ql client
2023/02/16 19:31:53 Expected and Actual scan responses are the same
2023/02/16 19:31:53 
--- SimpleRangeMultipleUnbounded ---
2023/02/16 19:31:53 distinct = false
2023/02/16 19:31:53 Using n1ql client
2023/02/16 19:31:53 Expected and Actual scan responses are the same
2023/02/16 19:31:53 
--- FiltersWithUnbounded ---
2023/02/16 19:31:53 distinct = false
2023/02/16 19:31:54 Using n1ql client
2023/02/16 19:31:54 Expected and Actual scan responses are the same
2023/02/16 19:31:54 
--- FiltersLowGreaterThanHigh ---
2023/02/16 19:31:54 distinct = false
2023/02/16 19:31:54 Using n1ql client
2023/02/16 19:31:54 Expected and Actual scan responses are the same
2023/02/16 19:31:54 

--------- Simple Index with 1 field ---------
2023/02/16 19:31:54 
--- SingleIndexSimpleRange ---
2023/02/16 19:31:54 distinct = false
2023/02/16 19:31:54 Using n1ql client
2023/02/16 19:31:54 
--- SingleIndex_SimpleRanges_NonOverlapping ---
2023/02/16 19:31:54 distinct = false
2023/02/16 19:31:55 Using n1ql client
2023/02/16 19:31:55 
--- SingleIndex_SimpleRanges_Overlapping ---
2023/02/16 19:31:55 distinct = false
2023/02/16 19:31:55 Using n1ql client
2023/02/16 19:31:55 

--------- Composite Index with 3 fields ---------
2023/02/16 19:31:55 
--- ScanAllNoFilter ---
2023/02/16 19:31:55 distinct = false
2023/02/16 19:31:56 Using n1ql client
2023/02/16 19:31:56 
--- ScanAllFilterNil ---
2023/02/16 19:31:56 distinct = false
2023/02/16 19:31:56 Using n1ql client
2023/02/16 19:31:56 
--- ScanAll_AllFiltersNil ---
2023/02/16 19:31:56 distinct = false
2023/02/16 19:31:57 Using n1ql client
2023/02/16 19:31:57 
--- 3FieldsSingleSeek ---
2023/02/16 19:31:57 distinct = false
2023/02/16 19:31:57 Using n1ql client
2023/02/16 19:31:57 
--- 3FieldsMultipleSeeks ---
2023/02/16 19:31:57 distinct = false
2023/02/16 19:31:58 Using n1ql client
2023/02/16 19:31:58 
--- 3FieldsMultipleSeeks_Identical ---
2023/02/16 19:31:58 distinct = false
2023/02/16 19:31:58 Using n1ql client
--- PASS: TestMultiScanOffset (11.25s)
=== RUN   TestMultiScanPrimaryIndex
2023/02/16 19:31:58 In TestMultiScanPrimaryIndex()
2023/02/16 19:31:58 
--- PrimaryRange ---
2023/02/16 19:31:58 Using n1ql client
2023/02/16 19:31:58 Expected and Actual scan responses are the same
2023/02/16 19:31:58 
--- PrimaryScanAllNoFilter ---
2023/02/16 19:31:58 Using n1ql client
2023/02/16 19:31:58 Expected and Actual scan responses are the same
--- PASS: TestMultiScanPrimaryIndex (0.14s)
=== RUN   TestMultiScanDistinct
2023/02/16 19:31:58 In TestScansDistinct()
2023/02/16 19:31:58 

--------- Composite Index with 2 fields ---------
2023/02/16 19:31:58 
--- ScanAllNoFilter ---
2023/02/16 19:31:58 distinct = true
2023/02/16 19:31:58 Using n1ql client
2023/02/16 19:31:59 Expected and Actual scan responses are the same
2023/02/16 19:31:59 
--- ScanAllFilterNil ---
2023/02/16 19:31:59 distinct = true
2023/02/16 19:31:59 Using n1ql client
2023/02/16 19:32:00 Expected and Actual scan responses are the same
2023/02/16 19:32:00 
--- ScanAll_AllFiltersNil ---
2023/02/16 19:32:00 distinct = true
2023/02/16 19:32:00 Using n1ql client
2023/02/16 19:32:00 Expected and Actual scan responses are the same
2023/02/16 19:32:00 
--- SingleSeek ---
2023/02/16 19:32:00 distinct = true
2023/02/16 19:32:01 Using n1ql client
2023/02/16 19:32:01 Expected and Actual scan responses are the same
2023/02/16 19:32:01 
--- MultipleSeek ---
2023/02/16 19:32:01 distinct = true
2023/02/16 19:32:01 Using n1ql client
2023/02/16 19:32:01 Expected and Actual scan responses are the same
2023/02/16 19:32:01 
--- SimpleRange ---
2023/02/16 19:32:01 distinct = true
2023/02/16 19:32:02 Using n1ql client
2023/02/16 19:32:02 Expected and Actual scan responses are the same
2023/02/16 19:32:02 
--- NonOverlappingRanges ---
2023/02/16 19:32:02 distinct = true
2023/02/16 19:32:02 Using n1ql client
2023/02/16 19:32:02 Expected and Actual scan responses are the same
2023/02/16 19:32:02 
--- OverlappingRanges ---
2023/02/16 19:32:02 distinct = true
2023/02/16 19:32:03 Using n1ql client
2023/02/16 19:32:03 Expected and Actual scan responses are the same
2023/02/16 19:32:03 
--- NonOverlappingFilters ---
2023/02/16 19:32:03 distinct = true
2023/02/16 19:32:03 Using n1ql client
2023/02/16 19:32:03 Expected and Actual scan responses are the same
2023/02/16 19:32:03 
--- OverlappingFilters ---
2023/02/16 19:32:03 distinct = true
2023/02/16 19:32:03 Using n1ql client
2023/02/16 19:32:03 Expected and Actual scan responses are the same
2023/02/16 19:32:03 
--- BoundaryFilters ---
2023/02/16 19:32:03 distinct = true
2023/02/16 19:32:04 Using n1ql client
2023/02/16 19:32:04 Expected and Actual scan responses are the same
2023/02/16 19:32:04 
--- SeekAndFilters_NonOverlapping ---
2023/02/16 19:32:04 distinct = true
2023/02/16 19:32:04 Using n1ql client
2023/02/16 19:32:04 Expected and Actual scan responses are the same
2023/02/16 19:32:04 
--- SeekAndFilters_Overlapping ---
2023/02/16 19:32:04 distinct = true
2023/02/16 19:32:05 Using n1ql client
2023/02/16 19:32:05 Expected and Actual scan responses are the same
2023/02/16 19:32:05 
--- SimpleRangeLowUnbounded ---
2023/02/16 19:32:05 distinct = false
2023/02/16 19:32:05 Using n1ql client
2023/02/16 19:32:05 Expected and Actual scan responses are the same
2023/02/16 19:32:05 
--- SimpleRangeHighUnbounded ---
2023/02/16 19:32:05 distinct = false
2023/02/16 19:32:06 Using n1ql client
2023/02/16 19:32:06 Expected and Actual scan responses are the same
2023/02/16 19:32:06 
--- SimpleRangeMultipleUnbounded ---
2023/02/16 19:32:06 distinct = false
2023/02/16 19:32:06 Using n1ql client
2023/02/16 19:32:06 Expected and Actual scan responses are the same
2023/02/16 19:32:06 
--- FiltersWithUnbounded ---
2023/02/16 19:32:06 distinct = false
2023/02/16 19:32:06 Using n1ql client
2023/02/16 19:32:06 Expected and Actual scan responses are the same
2023/02/16 19:32:06 
--- FiltersLowGreaterThanHigh ---
2023/02/16 19:32:06 distinct = false
2023/02/16 19:32:07 Using n1ql client
2023/02/16 19:32:07 Expected and Actual scan responses are the same
2023/02/16 19:32:07 

--------- Simple Index with 1 field ---------
2023/02/16 19:32:07 
--- SingleIndexSimpleRange ---
2023/02/16 19:32:07 distinct = true
2023/02/16 19:32:07 Using n1ql client
2023/02/16 19:32:07 Expected and Actual scan responses are the same
2023/02/16 19:32:07 
--- SingleIndex_SimpleRanges_NonOverlapping ---
2023/02/16 19:32:07 distinct = true
2023/02/16 19:32:07 Using n1ql client
2023/02/16 19:32:08 Expected and Actual scan responses are the same
2023/02/16 19:32:08 
--- SingleIndex_SimpleRanges_Overlapping ---
2023/02/16 19:32:08 distinct = true
2023/02/16 19:32:08 Using n1ql client
2023/02/16 19:32:08 Expected and Actual scan responses are the same
2023/02/16 19:32:08 

--------- Composite Index with 3 fields ---------
2023/02/16 19:32:08 
--- ScanAllNoFilter ---
2023/02/16 19:32:08 distinct = true
2023/02/16 19:32:08 Using n1ql client
2023/02/16 19:32:08 Expected and Actual scan responses are the same
2023/02/16 19:32:08 
--- ScanAllFilterNil ---
2023/02/16 19:32:08 distinct = true
2023/02/16 19:32:09 Using n1ql client
2023/02/16 19:32:09 Expected and Actual scan responses are the same
2023/02/16 19:32:09 
--- ScanAll_AllFiltersNil ---
2023/02/16 19:32:09 distinct = true
2023/02/16 19:32:09 Using n1ql client
2023/02/16 19:32:10 Expected and Actual scan responses are the same
2023/02/16 19:32:10 
--- 3FieldsSingleSeek ---
2023/02/16 19:32:10 distinct = true
2023/02/16 19:32:10 Using n1ql client
2023/02/16 19:32:10 Expected and Actual scan responses are the same
2023/02/16 19:32:10 
--- 3FieldsMultipleSeeks ---
2023/02/16 19:32:10 distinct = true
2023/02/16 19:32:10 Using n1ql client
2023/02/16 19:32:10 Expected and Actual scan responses are the same
2023/02/16 19:32:10 
--- 3FieldsMultipleSeeks_Identical ---
2023/02/16 19:32:10 distinct = true
2023/02/16 19:32:11 Using n1ql client
2023/02/16 19:32:11 Expected and Actual scan responses are the same
--- PASS: TestMultiScanDistinct (12.76s)
=== RUN   TestMultiScanProjection
2023/02/16 19:32:11 In TestMultiScanProjection()
2023/02/16 19:32:11 

--------- Composite Index with 2 fields ---------
2023/02/16 19:32:11 
--- ScanAllNoFilter ---
2023/02/16 19:32:11 distinct = true
2023/02/16 19:32:11 Using n1ql client
2023/02/16 19:32:11 Expected and Actual scan responses are the same
2023/02/16 19:32:11 
--- ScanAllFilterNil ---
2023/02/16 19:32:11 distinct = true
2023/02/16 19:32:12 Using n1ql client
2023/02/16 19:32:12 Expected and Actual scan responses are the same
2023/02/16 19:32:12 
--- ScanAll_AllFiltersNil ---
2023/02/16 19:32:12 distinct = true
2023/02/16 19:32:12 Using n1ql client
2023/02/16 19:32:12 Expected and Actual scan responses are the same
2023/02/16 19:32:12 
--- SingleSeek ---
2023/02/16 19:32:12 distinct = true
2023/02/16 19:32:13 Using n1ql client
2023/02/16 19:32:13 Expected and Actual scan responses are the same
2023/02/16 19:32:13 
--- MultipleSeek ---
2023/02/16 19:32:13 distinct = true
2023/02/16 19:32:13 Using n1ql client
2023/02/16 19:32:13 Expected and Actual scan responses are the same
2023/02/16 19:32:13 
--- SimpleRange ---
2023/02/16 19:32:13 distinct = true
2023/02/16 19:32:13 Using n1ql client
2023/02/16 19:32:13 Expected and Actual scan responses are the same
2023/02/16 19:32:13 
--- NonOverlappingRanges ---
2023/02/16 19:32:13 distinct = true
2023/02/16 19:32:14 Using n1ql client
2023/02/16 19:32:14 Expected and Actual scan responses are the same
2023/02/16 19:32:14 
--- OverlappingRanges ---
2023/02/16 19:32:14 distinct = true
2023/02/16 19:32:14 Using n1ql client
2023/02/16 19:32:14 Expected and Actual scan responses are the same
2023/02/16 19:32:14 
--- NonOverlappingFilters ---
2023/02/16 19:32:14 distinct = true
2023/02/16 19:32:15 Using n1ql client
2023/02/16 19:32:15 Expected and Actual scan responses are the same
2023/02/16 19:32:15 
--- OverlappingFilters ---
2023/02/16 19:32:15 distinct = true
2023/02/16 19:32:15 Using n1ql client
2023/02/16 19:32:15 Expected and Actual scan responses are the same
2023/02/16 19:32:15 
--- BoundaryFilters ---
2023/02/16 19:32:15 distinct = true
2023/02/16 19:32:16 Using n1ql client
2023/02/16 19:32:16 Expected and Actual scan responses are the same
2023/02/16 19:32:16 
--- SeekAndFilters_NonOverlapping ---
2023/02/16 19:32:16 distinct = true
2023/02/16 19:32:16 Using n1ql client
2023/02/16 19:32:16 Expected and Actual scan responses are the same
2023/02/16 19:32:16 
--- SeekAndFilters_Overlapping ---
2023/02/16 19:32:16 distinct = true
2023/02/16 19:32:16 Using n1ql client
2023/02/16 19:32:16 Expected and Actual scan responses are the same
2023/02/16 19:32:16 
--- SimpleRangeLowUnbounded ---
2023/02/16 19:32:16 distinct = false
2023/02/16 19:32:17 Using n1ql client
2023/02/16 19:32:17 Expected and Actual scan responses are the same
2023/02/16 19:32:17 
--- SimpleRangeHighUnbounded ---
2023/02/16 19:32:17 distinct = false
2023/02/16 19:32:17 Using n1ql client
2023/02/16 19:32:17 Expected and Actual scan responses are the same
2023/02/16 19:32:17 
--- SimpleRangeMultipleUnbounded ---
2023/02/16 19:32:17 distinct = false
2023/02/16 19:32:18 Using n1ql client
2023/02/16 19:32:18 Expected and Actual scan responses are the same
2023/02/16 19:32:18 
--- FiltersWithUnbounded ---
2023/02/16 19:32:18 distinct = false
2023/02/16 19:32:18 Using n1ql client
2023/02/16 19:32:18 Expected and Actual scan responses are the same
2023/02/16 19:32:18 
--- FiltersLowGreaterThanHigh ---
2023/02/16 19:32:18 distinct = false
2023/02/16 19:32:19 Using n1ql client
2023/02/16 19:32:19 Expected and Actual scan responses are the same
2023/02/16 19:32:19 

--------- Simple Index with 1 field ---------
2023/02/16 19:32:19 
--- SingleIndexSimpleRange ---
2023/02/16 19:32:19 distinct = true
2023/02/16 19:32:19 Using n1ql client
2023/02/16 19:32:19 Expected and Actual scan responses are the same
2023/02/16 19:32:19 
--- SingleIndex_SimpleRanges_NonOverlapping ---
2023/02/16 19:32:19 distinct = true
2023/02/16 19:32:19 Using n1ql client
2023/02/16 19:32:19 Expected and Actual scan responses are the same
2023/02/16 19:32:19 
--- SingleIndex_SimpleRanges_Overlapping ---
2023/02/16 19:32:19 distinct = true
2023/02/16 19:32:20 Using n1ql client
2023/02/16 19:32:20 Expected and Actual scan responses are the same
2023/02/16 19:32:20 

--------- Composite Index with 3 fields ---------
2023/02/16 19:32:20 
--- ScanAllNoFilter ---
2023/02/16 19:32:20 distinct = true
2023/02/16 19:32:20 Using n1ql client
2023/02/16 19:32:20 Expected and Actual scan responses are the same
2023/02/16 19:32:20 
--- ScanAllFilterNil ---
2023/02/16 19:32:20 distinct = true
2023/02/16 19:32:21 Using n1ql client
2023/02/16 19:32:21 Expected and Actual scan responses are the same
2023/02/16 19:32:21 
--- ScanAll_AllFiltersNil ---
2023/02/16 19:32:21 distinct = true
2023/02/16 19:32:21 Using n1ql client
2023/02/16 19:32:21 Expected and Actual scan responses are the same
2023/02/16 19:32:21 
--- 3FieldsSingleSeek ---
2023/02/16 19:32:21 distinct = true
2023/02/16 19:32:22 Using n1ql client
2023/02/16 19:32:22 Expected and Actual scan responses are the same
2023/02/16 19:32:22 
--- 3FieldsMultipleSeeks ---
2023/02/16 19:32:22 distinct = true
2023/02/16 19:32:22 Using n1ql client
2023/02/16 19:32:22 Expected and Actual scan responses are the same
2023/02/16 19:32:22 
--- 3FieldsMultipleSeeks_Identical ---
2023/02/16 19:32:22 distinct = true
2023/02/16 19:32:23 Using n1ql client
2023/02/16 19:32:23 Expected and Actual scan responses are the same
2023/02/16 19:32:23 indexes are: index_company, index_companyname, index_company_name_age, index_company_name_age_address, index_company_name_age_address_friends
2023/02/16 19:32:23 fields are: [company], [company name], [company name age], [company name age address], [company name age address friends]
2023/02/16 19:32:23 
--- SingleIndexProjectFirst ---
2023/02/16 19:32:23 distinct = true
2023/02/16 19:32:23 Using n1ql client
2023/02/16 19:32:23 Expected and Actual scan responses are the same
2023/02/16 19:32:23 
--- 2FieldIndexProjectSecond ---
2023/02/16 19:32:23 distinct = true
2023/02/16 19:32:23 Using n1ql client
2023/02/16 19:32:23 Expected and Actual scan responses are the same
2023/02/16 19:32:23 
--- 3FieldIndexProjectThird ---
2023/02/16 19:32:23 distinct = true
2023/02/16 19:32:24 Using n1ql client
2023/02/16 19:32:24 Expected and Actual scan responses are the same
2023/02/16 19:32:24 
--- 4FieldIndexProjectFourth ---
2023/02/16 19:32:24 distinct = true
2023/02/16 19:32:26 Using n1ql client
2023/02/16 19:32:26 Expected and Actual scan responses are the same
2023/02/16 19:32:26 
--- 5FieldIndexProjectFifth ---
2023/02/16 19:32:26 distinct = true
2023/02/16 19:32:29 Using n1ql client
2023/02/16 19:32:29 Expected and Actual scan responses are the same
2023/02/16 19:32:29 
--- 2FieldIndexProjectTwo ---
2023/02/16 19:32:29 distinct = true
2023/02/16 19:32:29 Using n1ql client
2023/02/16 19:32:29 Expected and Actual scan responses are the same
2023/02/16 19:32:29 
--- 3FieldIndexProjectTwo ---
2023/02/16 19:32:29 distinct = true
2023/02/16 19:32:30 Using n1ql client
2023/02/16 19:32:30 Expected and Actual scan responses are the same
2023/02/16 19:32:30 
--- 3FieldIndexProjectTwo ---
2023/02/16 19:32:30 distinct = true
2023/02/16 19:32:30 Using n1ql client
2023/02/16 19:32:30 Expected and Actual scan responses are the same
2023/02/16 19:32:30 
--- 3FieldIndexProjectTwo ---
2023/02/16 19:32:30 distinct = true
2023/02/16 19:32:31 Using n1ql client
2023/02/16 19:32:31 Expected and Actual scan responses are the same
2023/02/16 19:32:31 
--- 4FieldIndexProjectTwo ---
2023/02/16 19:32:31 distinct = true
2023/02/16 19:32:32 Using n1ql client
2023/02/16 19:32:32 Expected and Actual scan responses are the same
2023/02/16 19:32:32 
--- 4FieldIndexProjectTwo ---
2023/02/16 19:32:32 distinct = true
2023/02/16 19:32:34 Using n1ql client
2023/02/16 19:32:34 Expected and Actual scan responses are the same
2023/02/16 19:32:34 
--- 4FieldIndexProjectTwo ---
2023/02/16 19:32:34 distinct = true
2023/02/16 19:32:36 Using n1ql client
2023/02/16 19:32:36 Expected and Actual scan responses are the same
2023/02/16 19:32:36 
--- 4FieldIndexProjectTwo ---
2023/02/16 19:32:36 distinct = true
2023/02/16 19:32:37 Using n1ql client
2023/02/16 19:32:37 Expected and Actual scan responses are the same
2023/02/16 19:32:37 
--- 4FieldIndexProjectTwo ---
2023/02/16 19:32:37 distinct = true
2023/02/16 19:32:39 Using n1ql client
2023/02/16 19:32:39 Expected and Actual scan responses are the same
2023/02/16 19:32:39 
--- 5FieldIndexProjectTwo ---
2023/02/16 19:32:39 distinct = true
2023/02/16 19:32:42 Using n1ql client
2023/02/16 19:32:42 Expected and Actual scan responses are the same
2023/02/16 19:32:42 
--- 5FieldIndexProjectTwo ---
2023/02/16 19:32:42 distinct = true
2023/02/16 19:32:46 Using n1ql client
2023/02/16 19:32:46 Expected and Actual scan responses are the same
2023/02/16 19:32:46 
--- 5FieldIndexProjectTwo ---
2023/02/16 19:32:46 distinct = true
2023/02/16 19:32:49 Using n1ql client
2023/02/16 19:32:49 Expected and Actual scan responses are the same
2023/02/16 19:32:49 
--- 5FieldIndexProjectTwo ---
2023/02/16 19:32:49 distinct = true
2023/02/16 19:32:53 Using n1ql client
2023/02/16 19:32:53 Expected and Actual scan responses are the same
2023/02/16 19:32:53 
--- 5FieldIndexProjectThree ---
2023/02/16 19:32:53 distinct = true
2023/02/16 19:32:56 Using n1ql client
2023/02/16 19:32:56 Expected and Actual scan responses are the same
2023/02/16 19:32:56 
--- 5FieldIndexProjectFour ---
2023/02/16 19:32:56 distinct = true
2023/02/16 19:33:00 Using n1ql client
2023/02/16 19:33:00 Expected and Actual scan responses are the same
2023/02/16 19:33:00 
--- 5FieldIndexProjectAll ---
2023/02/16 19:33:00 distinct = true
2023/02/16 19:33:04 Using n1ql client
2023/02/16 19:33:04 Expected and Actual scan responses are the same
2023/02/16 19:33:04 
--- 5FieldIndexProjectAlternate ---
2023/02/16 19:33:04 distinct = true
2023/02/16 19:33:07 Using n1ql client
2023/02/16 19:33:08 Expected and Actual scan responses are the same
2023/02/16 19:33:08 
--- 5FieldIndexProjectEmptyEntryKeys ---
2023/02/16 19:33:08 distinct = true
2023/02/16 19:33:11 Using n1ql client
2023/02/16 19:33:11 Expected and Actual scan responses are the same
--- PASS: TestMultiScanProjection (60.03s)
=== RUN   TestMultiScanRestAPI
2023/02/16 19:33:11 In TestMultiScanRestAPI()
2023/02/16 19:33:11 In DropAllSecondaryIndexes()
2023/02/16 19:33:11 Index found:  index_primary
2023/02/16 19:33:11 Dropped index index_primary
2023/02/16 19:33:11 Index found:  index_company_name_age_address_friends
2023/02/16 19:33:11 Dropped index index_company_name_age_address_friends
2023/02/16 19:33:11 Index found:  index_company_name_age_address
2023/02/16 19:33:11 Dropped index index_company_name_age_address
2023/02/16 19:33:11 Index found:  addressidx
2023/02/16 19:33:11 Dropped index addressidx
2023/02/16 19:33:11 Index found:  index_companyname
2023/02/16 19:33:11 Dropped index index_companyname
2023/02/16 19:33:11 Index found:  index_company
2023/02/16 19:33:11 Dropped index index_company
2023/02/16 19:33:11 Index found:  index_company_name_age
2023/02/16 19:33:11 Dropped index index_company_name_age
2023/02/16 19:33:15 Created the secondary index index_companyname. Waiting for it to become active
2023/02/16 19:33:15 Index is 12033513278422605885 now active
2023/02/16 19:33:15 GET all indexes
2023/02/16 19:33:15 200 OK
2023/02/16 19:33:15 getscans status : 200 OK
2023/02/16 19:33:15 number of entries 337
2023/02/16 19:33:15 Status : 200 OK
2023/02/16 19:33:15 Result from multiscancount API = 0
--- PASS: TestMultiScanRestAPI (4.46s)
=== RUN   TestMultiScanPrimaryIndexVariations
2023/02/16 19:33:15 In TestMultiScanPrimaryIndexVariations()
2023/02/16 19:33:22 Created the secondary index index_pi. Waiting for it to become active
2023/02/16 19:33:22 Index is 7802491249755038528 now active
2023/02/16 19:33:22 
--- No Overlap ---
2023/02/16 19:33:22 Using n1ql client
2023/02/16 19:33:23 Expected and Actual scan responses are the same
2023/02/16 19:33:23 
--- Proper Overlap ---
2023/02/16 19:33:23 Using n1ql client
2023/02/16 19:33:23 Expected and Actual scan responses are the same
2023/02/16 19:33:23 
--- Low Boundary Overlap ---
2023/02/16 19:33:23 Using n1ql client
2023/02/16 19:33:23 Expected and Actual scan responses are the same
2023/02/16 19:33:23 
--- Complex Overlaps ---
2023/02/16 19:33:23 Using n1ql client
2023/02/16 19:33:23 Expected and Actual scan responses are the same
2023/02/16 19:33:23 
--- Multiple Equal Overlaps ---
2023/02/16 19:33:23 Using n1ql client
2023/02/16 19:33:23 Expected and Actual scan responses are the same
2023/02/16 19:33:23 
--- Boundary and Subset Overlaps ---
2023/02/16 19:33:23 Using n1ql client
2023/02/16 19:33:23 Expected and Actual scan responses are the same
2023/02/16 19:33:23 
--- Point Overlaps ---
2023/02/16 19:33:23 Using n1ql client
2023/02/16 19:33:23 Expected and Actual scan responses are the same
2023/02/16 19:33:23 
--- Boundary and Point Overlaps ---
2023/02/16 19:33:23 Using n1ql client
2023/02/16 19:33:23 Expected and Actual scan responses are the same
2023/02/16 19:33:23 
--- Primary index range null ---
2023/02/16 19:33:23 Using n1ql client
2023/02/16 19:33:23 Expected and Actual scan responses are the same
2023/02/16 19:33:23 Dropping the secondary index index_pi
2023/02/16 19:33:23 Index dropped
--- PASS: TestMultiScanPrimaryIndexVariations (8.05s)
=== RUN   TestMultiScanDescSetup
2023/02/16 19:33:23 In TestMultiScanDescSetup()
2023/02/16 19:33:23 In DropAllSecondaryIndexes()
2023/02/16 19:33:23 Index found:  index_companyname
2023/02/16 19:33:23 Dropped index index_companyname
2023/02/16 19:33:28 Created the secondary index index_companyname_desc. Waiting for it to become active
2023/02/16 19:33:28 Index is 4667276839370261139 now active
2023/02/16 19:33:36 Created the secondary index index_company_desc. Waiting for it to become active
2023/02/16 19:33:36 Index is 8012239010069516560 now active
2023/02/16 19:33:42 Created the secondary index index_company_name_age_desc. Waiting for it to become active
2023/02/16 19:33:42 Index is 10670660689345607080 now active
--- PASS: TestMultiScanDescSetup (18.77s)
=== RUN   TestMultiScanDescScenarios
2023/02/16 19:33:42 In TestMultiScanDescScenarios()
2023/02/16 19:33:42 

--------- Composite Index with 2 fields ---------
2023/02/16 19:33:42 
--- ScanAllNoFilter ---
2023/02/16 19:33:42 distinct = false
2023/02/16 19:33:43 Using n1ql client
2023/02/16 19:33:43 Expected and Actual scan responses are the same
2023/02/16 19:33:43 
--- ScanAllFilterNil ---
2023/02/16 19:33:43 distinct = false
2023/02/16 19:33:43 Using n1ql client
2023/02/16 19:33:43 Expected and Actual scan responses are the same
2023/02/16 19:33:43 
--- ScanAll_AllFiltersNil ---
2023/02/16 19:33:43 distinct = false
2023/02/16 19:33:44 Using n1ql client
2023/02/16 19:33:44 Expected and Actual scan responses are the same
2023/02/16 19:33:44 
--- SingleSeek ---
2023/02/16 19:33:44 distinct = false
2023/02/16 19:33:44 Using n1ql client
2023/02/16 19:33:44 Expected and Actual scan responses are the same
2023/02/16 19:33:44 
--- MultipleSeek ---
2023/02/16 19:33:44 distinct = false
2023/02/16 19:33:44 Using n1ql client
2023/02/16 19:33:44 Expected and Actual scan responses are the same
2023/02/16 19:33:44 
--- SimpleRange ---
2023/02/16 19:33:44 distinct = false
2023/02/16 19:33:45 Using n1ql client
2023/02/16 19:33:45 Expected and Actual scan responses are the same
2023/02/16 19:33:45 
--- NonOverlappingRanges ---
2023/02/16 19:33:45 distinct = false
2023/02/16 19:33:45 Using n1ql client
2023/02/16 19:33:45 Expected and Actual scan responses are the same
2023/02/16 19:33:45 
--- OverlappingRanges ---
2023/02/16 19:33:45 distinct = false
2023/02/16 19:33:46 Using n1ql client
2023/02/16 19:33:46 Expected and Actual scan responses are the same
2023/02/16 19:33:46 
--- NonOverlappingFilters ---
2023/02/16 19:33:46 distinct = false
2023/02/16 19:33:46 Using n1ql client
2023/02/16 19:33:46 Expected and Actual scan responses are the same
2023/02/16 19:33:46 
--- OverlappingFilters ---
2023/02/16 19:33:46 distinct = false
2023/02/16 19:33:47 Using n1ql client
2023/02/16 19:33:47 Expected and Actual scan responses are the same
2023/02/16 19:33:47 
--- BoundaryFilters ---
2023/02/16 19:33:47 distinct = false
2023/02/16 19:33:47 Using n1ql client
2023/02/16 19:33:47 Expected and Actual scan responses are the same
2023/02/16 19:33:47 
--- SeekAndFilters_NonOverlapping ---
2023/02/16 19:33:47 distinct = false
2023/02/16 19:33:47 Using n1ql client
2023/02/16 19:33:47 Expected and Actual scan responses are the same
2023/02/16 19:33:47 
--- SeekAndFilters_Overlapping ---
2023/02/16 19:33:47 distinct = false
2023/02/16 19:33:48 Using n1ql client
2023/02/16 19:33:48 Expected and Actual scan responses are the same
2023/02/16 19:33:48 
--- SimpleRangeLowUnbounded ---
2023/02/16 19:33:48 distinct = false
2023/02/16 19:33:48 Using n1ql client
2023/02/16 19:33:48 Expected and Actual scan responses are the same
2023/02/16 19:33:48 
--- SimpleRangeHighUnbounded ---
2023/02/16 19:33:48 distinct = false
2023/02/16 19:33:49 Using n1ql client
2023/02/16 19:33:49 Expected and Actual scan responses are the same
2023/02/16 19:33:49 
--- SimpleRangeMultipleUnbounded ---
2023/02/16 19:33:49 distinct = false
2023/02/16 19:33:49 Using n1ql client
2023/02/16 19:33:49 Expected and Actual scan responses are the same
2023/02/16 19:33:49 
--- FiltersWithUnbounded ---
2023/02/16 19:33:49 distinct = false
2023/02/16 19:33:50 Using n1ql client
2023/02/16 19:33:50 Expected and Actual scan responses are the same
2023/02/16 19:33:50 
--- FiltersLowGreaterThanHigh ---
2023/02/16 19:33:50 distinct = false
2023/02/16 19:33:50 Using n1ql client
2023/02/16 19:33:50 Expected and Actual scan responses are the same
2023/02/16 19:33:50 

--------- Simple Index with 1 field ---------
2023/02/16 19:33:50 
--- SingleIndexSimpleRange ---
2023/02/16 19:33:50 distinct = false
2023/02/16 19:33:50 Using n1ql client
2023/02/16 19:33:50 Expected and Actual scan responses are the same
2023/02/16 19:33:50 
--- SingleIndex_SimpleRanges_NonOverlapping ---
2023/02/16 19:33:50 distinct = false
2023/02/16 19:33:51 Using n1ql client
2023/02/16 19:33:51 Expected and Actual scan responses are the same
2023/02/16 19:33:51 
--- SingleIndex_SimpleRanges_Overlapping ---
2023/02/16 19:33:51 distinct = false
2023/02/16 19:33:51 Using n1ql client
2023/02/16 19:33:51 Expected and Actual scan responses are the same
2023/02/16 19:33:51 

--------- Composite Index with 3 fields ---------
2023/02/16 19:33:51 
--- ScanAllNoFilter ---
2023/02/16 19:33:51 distinct = false
2023/02/16 19:33:52 Using n1ql client
2023/02/16 19:33:52 Expected and Actual scan responses are the same
2023/02/16 19:33:52 
--- ScanAllFilterNil ---
2023/02/16 19:33:52 distinct = false
2023/02/16 19:33:52 Using n1ql client
2023/02/16 19:33:52 Expected and Actual scan responses are the same
2023/02/16 19:33:52 
--- ScanAll_AllFiltersNil ---
2023/02/16 19:33:52 distinct = false
2023/02/16 19:33:53 Using n1ql client
2023/02/16 19:33:53 Expected and Actual scan responses are the same
2023/02/16 19:33:53 
--- 3FieldsSingleSeek ---
2023/02/16 19:33:53 distinct = false
2023/02/16 19:33:53 Using n1ql client
2023/02/16 19:33:53 Expected and Actual scan responses are the same
2023/02/16 19:33:53 
--- 3FieldsMultipleSeeks ---
2023/02/16 19:33:53 distinct = false
2023/02/16 19:33:54 Using n1ql client
2023/02/16 19:33:54 Expected and Actual scan responses are the same
2023/02/16 19:33:54 
--- 3FieldsMultipleSeeks_Identical ---
2023/02/16 19:33:54 distinct = false
2023/02/16 19:33:54 Using n1ql client
2023/02/16 19:33:54 Expected and Actual scan responses are the same
2023/02/16 19:33:54 

--------- New scenarios ---------
2023/02/16 19:33:54 
--- CompIndexHighUnbounded1 ---
2023/02/16 19:33:54 
--- Multi Scan 0 ---
2023/02/16 19:33:54 distinct = false
2023/02/16 19:33:55 Using n1ql client
2023/02/16 19:33:55 Expected and Actual scan responses are the same
2023/02/16 19:33:55 
--- Multi Scan 1 ---
2023/02/16 19:33:55 distinct = false
2023/02/16 19:33:55 Using n1ql client
2023/02/16 19:33:55 Expected and Actual scan responses are the same
2023/02/16 19:33:55 
--- Multi Scan 2 ---
2023/02/16 19:33:55 distinct = false
2023/02/16 19:33:55 Using n1ql client
2023/02/16 19:33:55 Expected and Actual scan responses are the same
2023/02/16 19:33:55 
--- CompIndexHighUnbounded2 ---
2023/02/16 19:33:55 
--- Multi Scan 0 ---
2023/02/16 19:33:55 distinct = false
2023/02/16 19:33:56 Using n1ql client
2023/02/16 19:33:56 Expected and Actual scan responses are the same
2023/02/16 19:33:56 
--- Multi Scan 1 ---
2023/02/16 19:33:56 distinct = false
2023/02/16 19:33:56 Using n1ql client
2023/02/16 19:33:56 Expected and Actual scan responses are the same
2023/02/16 19:33:56 
--- Multi Scan 2 ---
2023/02/16 19:33:56 distinct = false
2023/02/16 19:33:57 Using n1ql client
2023/02/16 19:33:57 Expected and Actual scan responses are the same
2023/02/16 19:33:57 
--- CompIndexHighUnbounded3 ---
2023/02/16 19:33:57 
--- Multi Scan 0 ---
2023/02/16 19:33:57 distinct = false
2023/02/16 19:33:57 Using n1ql client
2023/02/16 19:33:57 Expected and Actual scan responses are the same
2023/02/16 19:33:57 
--- CompIndexHighUnbounded4 ---
2023/02/16 19:33:57 
--- Multi Scan 0 ---
2023/02/16 19:33:57 distinct = false
2023/02/16 19:33:57 Using n1ql client
2023/02/16 19:33:57 Expected and Actual scan responses are the same
2023/02/16 19:33:57 
--- CompIndexHighUnbounded5 ---
2023/02/16 19:33:57 
--- Multi Scan 0 ---
2023/02/16 19:33:57 distinct = false
2023/02/16 19:33:58 Using n1ql client
2023/02/16 19:33:58 Expected and Actual scan responses are the same
2023/02/16 19:33:58 
--- SeekBoundaries ---
2023/02/16 19:33:58 
--- Multi Scan 0 ---
2023/02/16 19:33:58 distinct = false
2023/02/16 19:33:58 Using n1ql client
2023/02/16 19:33:58 Expected and Actual scan responses are the same
2023/02/16 19:33:58 
--- Multi Scan 1 ---
2023/02/16 19:33:58 distinct = false
2023/02/16 19:33:59 Using n1ql client
2023/02/16 19:33:59 Expected and Actual scan responses are the same
2023/02/16 19:33:59 
--- Multi Scan 2 ---
2023/02/16 19:33:59 distinct = false
2023/02/16 19:34:00 Using n1ql client
2023/02/16 19:34:00 Expected and Actual scan responses are the same
2023/02/16 19:34:00 
--- Multi Scan 3 ---
2023/02/16 19:34:00 distinct = false
2023/02/16 19:34:00 Using n1ql client
2023/02/16 19:34:00 Expected and Actual scan responses are the same
2023/02/16 19:34:00 
--- Multi Scan 4 ---
2023/02/16 19:34:00 distinct = false
2023/02/16 19:34:01 Using n1ql client
2023/02/16 19:34:01 Expected and Actual scan responses are the same
2023/02/16 19:34:01 
--- Multi Scan 5 ---
2023/02/16 19:34:01 distinct = false
2023/02/16 19:34:01 Using n1ql client
2023/02/16 19:34:01 Expected and Actual scan responses are the same
2023/02/16 19:34:01 
--- Multi Scan 6 ---
2023/02/16 19:34:01 distinct = false
2023/02/16 19:34:02 Using n1ql client
2023/02/16 19:34:02 Expected and Actual scan responses are the same
2023/02/16 19:34:02 
--- Multi Scan 7 ---
2023/02/16 19:34:02 distinct = false
2023/02/16 19:34:02 Using n1ql client
2023/02/16 19:34:02 Expected and Actual scan responses are the same
--- PASS: TestMultiScanDescScenarios (19.97s)
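The "Expected and Actual scan responses are the same" line repeated throughout the scenarios above reflects a validation step where the index scan result is checked against an expected result set computed independently. A minimal sketch of that comparison pattern (the function name and normalization step are illustrative, not the actual testrunner helpers):

```go
package main

import (
	"fmt"
	"reflect"
	"sort"
)

// validateScan normalizes both result sets by sorting, then compares
// them for deep equality -- the shape of the "Expected and Actual scan
// responses are the same" check logged by each scenario.
func validateScan(expected, actual []string) bool {
	sort.Strings(expected)
	sort.Strings(actual)
	return reflect.DeepEqual(expected, actual)
}

func main() {
	expected := []string{"doc2", "doc1"}
	actual := []string{"doc1", "doc2"}
	fmt.Println(validateScan(expected, actual)) // order-insensitive match
}
```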
=== RUN   TestMultiScanDescCount
2023/02/16 19:34:02 In TestMultiScanDescCount()
2023/02/16 19:34:02 

--------- Composite Index with 2 fields ---------
2023/02/16 19:34:02 
--- ScanAllNoFilter ---
2023/02/16 19:34:02 distinct = false
2023/02/16 19:34:03 Using n1ql client
2023/02/16 19:34:03 MultiScanCount = 10002 ExpectedMultiScanCount = 10002
2023/02/16 19:34:03 
--- ScanAllFilterNil ---
2023/02/16 19:34:03 distinct = false
2023/02/16 19:34:03 Using n1ql client
2023/02/16 19:34:03 MultiScanCount = 10002 ExpectedMultiScanCount = 10002
2023/02/16 19:34:03 
--- ScanAll_AllFiltersNil ---
2023/02/16 19:34:03 distinct = false
2023/02/16 19:34:03 Using n1ql client
2023/02/16 19:34:03 MultiScanCount = 10002 ExpectedMultiScanCount = 10002
2023/02/16 19:34:03 
--- SingleSeek ---
2023/02/16 19:34:03 distinct = false
2023/02/16 19:34:04 Using n1ql client
2023/02/16 19:34:04 MultiScanCount = 1 ExpectedMultiScanCount = 1
2023/02/16 19:34:04 
--- MultipleSeek ---
2023/02/16 19:34:04 distinct = false
2023/02/16 19:34:04 Using n1ql client
2023/02/16 19:34:04 MultiScanCount = 2 ExpectedMultiScanCount = 2
2023/02/16 19:34:04 
--- SimpleRange ---
2023/02/16 19:34:04 distinct = false
2023/02/16 19:34:05 Using n1ql client
2023/02/16 19:34:05 MultiScanCount = 2273 ExpectedMultiScanCount = 2273
2023/02/16 19:34:05 
--- NonOverlappingRanges ---
2023/02/16 19:34:05 distinct = false
2023/02/16 19:34:05 Using n1ql client
2023/02/16 19:34:05 MultiScanCount = 4283 ExpectedMultiScanCount = 4283
2023/02/16 19:34:05 
--- OverlappingRanges ---
2023/02/16 19:34:05 distinct = false
2023/02/16 19:34:05 Using n1ql client
2023/02/16 19:34:05 MultiScanCount = 5756 ExpectedMultiScanCount = 5756
2023/02/16 19:34:05 
--- NonOverlappingFilters ---
2023/02/16 19:34:05 distinct = false
2023/02/16 19:34:06 Using n1ql client
2023/02/16 19:34:06 MultiScanCount = 337 ExpectedMultiScanCount = 337
2023/02/16 19:34:06 
--- OverlappingFilters ---
2023/02/16 19:34:06 distinct = false
2023/02/16 19:34:06 Using n1ql client
2023/02/16 19:34:06 MultiScanCount = 2559 ExpectedMultiScanCount = 2559
2023/02/16 19:34:06 
--- BoundaryFilters ---
2023/02/16 19:34:06 distinct = false
2023/02/16 19:34:07 Using n1ql client
2023/02/16 19:34:07 MultiScanCount = 499 ExpectedMultiScanCount = 499
2023/02/16 19:34:07 
--- SeekAndFilters_NonOverlapping ---
2023/02/16 19:34:07 distinct = false
2023/02/16 19:34:07 Using n1ql client
2023/02/16 19:34:07 MultiScanCount = 256 ExpectedMultiScanCount = 256
2023/02/16 19:34:07 
--- SeekAndFilters_Overlapping ---
2023/02/16 19:34:07 distinct = false
2023/02/16 19:34:07 Using n1ql client
2023/02/16 19:34:07 MultiScanCount = 255 ExpectedMultiScanCount = 255
2023/02/16 19:34:07 
--- SimpleRangeLowUnbounded ---
2023/02/16 19:34:07 distinct = false
2023/02/16 19:34:08 Using n1ql client
2023/02/16 19:34:08 MultiScanCount = 5618 ExpectedMultiScanCount = 5618
2023/02/16 19:34:08 
--- SimpleRangeHighUnbounded ---
2023/02/16 19:34:08 distinct = false
2023/02/16 19:34:08 Using n1ql client
2023/02/16 19:34:08 MultiScanCount = 3704 ExpectedMultiScanCount = 3704
2023/02/16 19:34:08 
--- SimpleRangeMultipleUnbounded ---
2023/02/16 19:34:08 distinct = false
2023/02/16 19:34:09 Using n1ql client
2023/02/16 19:34:09 MultiScanCount = 10002 ExpectedMultiScanCount = 10002
2023/02/16 19:34:09 
--- FiltersWithUnbounded ---
2023/02/16 19:34:09 distinct = false
2023/02/16 19:34:09 Using n1ql client
2023/02/16 19:34:09 MultiScanCount = 3173 ExpectedMultiScanCount = 3173
2023/02/16 19:34:09 
--- FiltersLowGreaterThanHigh ---
2023/02/16 19:34:09 distinct = false
2023/02/16 19:34:09 Using n1ql client
2023/02/16 19:34:09 MultiScanCount = 418 ExpectedMultiScanCount = 418
2023/02/16 19:34:09 

--------- Simple Index with 1 field ---------
2023/02/16 19:34:09 
--- SingleIndexSimpleRange ---
2023/02/16 19:34:09 distinct = false
2023/02/16 19:34:10 Using n1ql client
2023/02/16 19:34:10 MultiScanCount = 2273 ExpectedMultiScanCount = 2273
2023/02/16 19:34:10 
--- SingleIndex_SimpleRanges_NonOverlapping ---
2023/02/16 19:34:10 distinct = false
2023/02/16 19:34:10 Using n1ql client
2023/02/16 19:34:10 MultiScanCount = 7140 ExpectedMultiScanCount = 7140
2023/02/16 19:34:10 
--- SingleIndex_SimpleRanges_Overlapping ---
2023/02/16 19:34:10 distinct = false
2023/02/16 19:34:10 Using n1ql client
2023/02/16 19:34:10 MultiScanCount = 8701 ExpectedMultiScanCount = 8701
2023/02/16 19:34:10 

--------- Composite Index with 3 fields ---------
2023/02/16 19:34:10 
--- ScanAllNoFilter ---
2023/02/16 19:34:10 distinct = false
2023/02/16 19:34:11 Using n1ql client
2023/02/16 19:34:11 MultiScanCount = 10002 ExpectedMultiScanCount = 10002
2023/02/16 19:34:11 
--- ScanAllFilterNil ---
2023/02/16 19:34:11 distinct = false
2023/02/16 19:34:11 Using n1ql client
2023/02/16 19:34:11 MultiScanCount = 10002 ExpectedMultiScanCount = 10002
2023/02/16 19:34:11 
--- ScanAll_AllFiltersNil ---
2023/02/16 19:34:11 distinct = false
2023/02/16 19:34:12 Using n1ql client
2023/02/16 19:34:12 MultiScanCount = 10002 ExpectedMultiScanCount = 10002
2023/02/16 19:34:12 
--- 3FieldsSingleSeek ---
2023/02/16 19:34:12 distinct = false
2023/02/16 19:34:12 Using n1ql client
2023/02/16 19:34:12 MultiScanCount = 1 ExpectedMultiScanCount = 1
2023/02/16 19:34:12 
--- 3FieldsMultipleSeeks ---
2023/02/16 19:34:12 distinct = false
2023/02/16 19:34:13 Using n1ql client
2023/02/16 19:34:13 MultiScanCount = 3 ExpectedMultiScanCount = 3
2023/02/16 19:34:13 
--- 3FieldsMultipleSeeks_Identical ---
2023/02/16 19:34:13 distinct = false
2023/02/16 19:34:13 Using n1ql client
2023/02/16 19:34:13 MultiScanCount = 2 ExpectedMultiScanCount = 2
2023/02/16 19:34:13 

--------- New scenarios ---------
2023/02/16 19:34:13 
--- CompIndexHighUnbounded1 ---
2023/02/16 19:34:13 
--- Multi Scan 0 ---
2023/02/16 19:34:13 distinct = false
2023/02/16 19:34:13 Using n1ql client
2023/02/16 19:34:13 Using n1ql client
2023/02/16 19:34:13 len(scanResults) = 8 MultiScanCount = 8
2023/02/16 19:34:13 Expected and Actual scan responses are the same
2023/02/16 19:34:13 
--- Multi Scan 1 ---
2023/02/16 19:34:13 distinct = false
2023/02/16 19:34:14 Using n1ql client
2023/02/16 19:34:14 Using n1ql client
2023/02/16 19:34:14 len(scanResults) = 0 MultiScanCount = 0
2023/02/16 19:34:14 Expected and Actual scan responses are the same
2023/02/16 19:34:14 
--- Multi Scan 2 ---
2023/02/16 19:34:14 distinct = false
2023/02/16 19:34:14 Using n1ql client
2023/02/16 19:34:14 Using n1ql client
2023/02/16 19:34:14 len(scanResults) = 9 MultiScanCount = 9
2023/02/16 19:34:14 Expected and Actual scan responses are the same
2023/02/16 19:34:14 
--- CompIndexHighUnbounded2 ---
2023/02/16 19:34:14 
--- Multi Scan 0 ---
2023/02/16 19:34:14 distinct = false
2023/02/16 19:34:15 Using n1ql client
2023/02/16 19:34:15 Using n1ql client
2023/02/16 19:34:15 len(scanResults) = 4138 MultiScanCount = 4138
2023/02/16 19:34:15 Expected and Actual scan responses are the same
2023/02/16 19:34:15 
--- Multi Scan 1 ---
2023/02/16 19:34:15 distinct = false
2023/02/16 19:34:15 Using n1ql client
2023/02/16 19:34:15 Using n1ql client
2023/02/16 19:34:15 len(scanResults) = 2746 MultiScanCount = 2746
2023/02/16 19:34:15 Expected and Actual scan responses are the same
2023/02/16 19:34:15 
--- Multi Scan 2 ---
2023/02/16 19:34:15 distinct = false
2023/02/16 19:34:16 Using n1ql client
2023/02/16 19:34:16 Using n1ql client
2023/02/16 19:34:16 len(scanResults) = 4691 MultiScanCount = 4691
2023/02/16 19:34:16 Expected and Actual scan responses are the same
2023/02/16 19:34:16 
--- CompIndexHighUnbounded3 ---
2023/02/16 19:34:16 
--- Multi Scan 0 ---
2023/02/16 19:34:16 distinct = false
2023/02/16 19:34:16 Using n1ql client
2023/02/16 19:34:16 Using n1ql client
2023/02/16 19:34:16 len(scanResults) = 1329 MultiScanCount = 1329
2023/02/16 19:34:16 Expected and Actual scan responses are the same
2023/02/16 19:34:16 
--- CompIndexHighUnbounded4 ---
2023/02/16 19:34:16 
--- Multi Scan 0 ---
2023/02/16 19:34:16 distinct = false
2023/02/16 19:34:16 Using n1ql client
2023/02/16 19:34:16 Using n1ql client
2023/02/16 19:34:16 len(scanResults) = 5349 MultiScanCount = 5349
2023/02/16 19:34:16 Expected and Actual scan responses are the same
2023/02/16 19:34:16 
--- CompIndexHighUnbounded5 ---
2023/02/16 19:34:16 
--- Multi Scan 0 ---
2023/02/16 19:34:16 distinct = false
2023/02/16 19:34:17 Using n1ql client
2023/02/16 19:34:17 Using n1ql client
2023/02/16 19:34:17 len(scanResults) = 8210 MultiScanCount = 8210
2023/02/16 19:34:17 Expected and Actual scan responses are the same
2023/02/16 19:34:17 
--- SeekBoundaries ---
2023/02/16 19:34:17 
--- Multi Scan 0 ---
2023/02/16 19:34:17 distinct = false
2023/02/16 19:34:17 Using n1ql client
2023/02/16 19:34:17 Using n1ql client
2023/02/16 19:34:17 len(scanResults) = 175 MultiScanCount = 175
2023/02/16 19:34:17 Expected and Actual scan responses are the same
2023/02/16 19:34:17 
--- Multi Scan 1 ---
2023/02/16 19:34:17 distinct = false
2023/02/16 19:34:18 Using n1ql client
2023/02/16 19:34:18 Using n1ql client
2023/02/16 19:34:18 len(scanResults) = 1 MultiScanCount = 1
2023/02/16 19:34:18 Expected and Actual scan responses are the same
2023/02/16 19:34:18 
--- Multi Scan 2 ---
2023/02/16 19:34:18 distinct = false
2023/02/16 19:34:18 Using n1ql client
2023/02/16 19:34:18 Using n1ql client
2023/02/16 19:34:18 len(scanResults) = 555 MultiScanCount = 555
2023/02/16 19:34:18 Expected and Actual scan responses are the same
2023/02/16 19:34:18 
--- Multi Scan 3 ---
2023/02/16 19:34:18 distinct = false
2023/02/16 19:34:19 Using n1ql client
2023/02/16 19:34:19 Using n1ql client
2023/02/16 19:34:19 len(scanResults) = 872 MultiScanCount = 872
2023/02/16 19:34:19 Expected and Actual scan responses are the same
2023/02/16 19:34:19 
--- Multi Scan 4 ---
2023/02/16 19:34:19 distinct = false
2023/02/16 19:34:19 Using n1ql client
2023/02/16 19:34:19 Using n1ql client
2023/02/16 19:34:19 len(scanResults) = 287 MultiScanCount = 287
2023/02/16 19:34:19 Expected and Actual scan responses are the same
2023/02/16 19:34:19 
--- Multi Scan 5 ---
2023/02/16 19:34:19 distinct = false
2023/02/16 19:34:19 Using n1ql client
2023/02/16 19:34:19 Using n1ql client
2023/02/16 19:34:19 len(scanResults) = 5254 MultiScanCount = 5254
2023/02/16 19:34:19 Expected and Actual scan responses are the same
2023/02/16 19:34:19 
--- Multi Scan 6 ---
2023/02/16 19:34:19 distinct = false
2023/02/16 19:34:20 Using n1ql client
2023/02/16 19:34:20 Using n1ql client
2023/02/16 19:34:20 len(scanResults) = 5566 MultiScanCount = 5566
2023/02/16 19:34:20 Expected and Actual scan responses are the same
2023/02/16 19:34:20 
--- Multi Scan 7 ---
2023/02/16 19:34:20 distinct = false
2023/02/16 19:34:20 Using n1ql client
2023/02/16 19:34:20 Using n1ql client
2023/02/16 19:34:20 len(scanResults) = 8 MultiScanCount = 8
2023/02/16 19:34:20 Expected and Actual scan responses are the same
2023/02/16 19:34:20 

--------- With DISTINCT True ---------
2023/02/16 19:34:20 
--- ScanAllNoFilter ---
2023/02/16 19:34:20 distinct = true
2023/02/16 19:34:21 Using n1ql client
2023/02/16 19:34:21 MultiScanCount = 999 ExpectedMultiScanCount = 999
2023/02/16 19:34:21 
--- ScanAllFilterNil ---
2023/02/16 19:34:21 distinct = true
2023/02/16 19:34:21 Using n1ql client
2023/02/16 19:34:21 MultiScanCount = 999 ExpectedMultiScanCount = 999
2023/02/16 19:34:21 
--- ScanAll_AllFiltersNil ---
2023/02/16 19:34:21 distinct = true
2023/02/16 19:34:22 Using n1ql client
2023/02/16 19:34:22 MultiScanCount = 999 ExpectedMultiScanCount = 999
2023/02/16 19:34:22 
--- SingleSeek ---
2023/02/16 19:34:22 distinct = true
2023/02/16 19:34:22 Using n1ql client
2023/02/16 19:34:22 MultiScanCount = 1 ExpectedMultiScanCount = 1
2023/02/16 19:34:22 
--- MultipleSeek ---
2023/02/16 19:34:22 distinct = true
2023/02/16 19:34:22 Using n1ql client
2023/02/16 19:34:22 MultiScanCount = 2 ExpectedMultiScanCount = 2
2023/02/16 19:34:22 
--- SimpleRange ---
2023/02/16 19:34:22 distinct = true
2023/02/16 19:34:23 Using n1ql client
2023/02/16 19:34:23 MultiScanCount = 227 ExpectedMultiScanCount = 227
2023/02/16 19:34:23 
--- NonOverlappingRanges ---
2023/02/16 19:34:23 distinct = true
2023/02/16 19:34:23 Using n1ql client
2023/02/16 19:34:23 MultiScanCount = 428 ExpectedMultiScanCount = 428
2023/02/16 19:34:23 
--- NonOverlappingFilters2 ---
2023/02/16 19:34:23 distinct = true
2023/02/16 19:34:24 Using n1ql client
2023/02/16 19:34:24 MultiScanCount = 1 ExpectedMultiScanCount = 1
2023/02/16 19:34:24 
--- OverlappingRanges ---
2023/02/16 19:34:24 distinct = true
2023/02/16 19:34:24 Using n1ql client
2023/02/16 19:34:24 MultiScanCount = 575 ExpectedMultiScanCount = 575
2023/02/16 19:34:24 
--- NonOverlappingFilters ---
2023/02/16 19:34:24 distinct = true
2023/02/16 19:34:24 Using n1ql client
2023/02/16 19:34:24 MultiScanCount = 186 ExpectedMultiScanCount = 186
2023/02/16 19:34:24 
--- OverlappingFilters ---
2023/02/16 19:34:24 distinct = true
2023/02/16 19:34:25 Using n1ql client
2023/02/16 19:34:25 MultiScanCount = 543 ExpectedMultiScanCount = 543
2023/02/16 19:34:25 
--- BoundaryFilters ---
2023/02/16 19:34:25 distinct = true
2023/02/16 19:34:25 Using n1ql client
2023/02/16 19:34:25 MultiScanCount = 172 ExpectedMultiScanCount = 172
2023/02/16 19:34:25 
--- SeekAndFilters_NonOverlapping ---
2023/02/16 19:34:25 distinct = true
2023/02/16 19:34:26 Using n1ql client
2023/02/16 19:34:26 MultiScanCount = 135 ExpectedMultiScanCount = 135
2023/02/16 19:34:26 
--- SeekAndFilters_Overlapping ---
2023/02/16 19:34:26 distinct = true
2023/02/16 19:34:26 Using n1ql client
2023/02/16 19:34:26 MultiScanCount = 134 ExpectedMultiScanCount = 134
2023/02/16 19:34:26 
--- SimpleRangeLowUnbounded ---
2023/02/16 19:34:26 distinct = false
2023/02/16 19:34:26 Using n1ql client
2023/02/16 19:34:26 MultiScanCount = 5618 ExpectedMultiScanCount = 5618
2023/02/16 19:34:26 
--- SimpleRangeHighUnbounded ---
2023/02/16 19:34:26 distinct = false
2023/02/16 19:34:27 Using n1ql client
2023/02/16 19:34:27 MultiScanCount = 3704 ExpectedMultiScanCount = 3704
2023/02/16 19:34:27 
--- SimpleRangeMultipleUnbounded ---
2023/02/16 19:34:27 distinct = false
2023/02/16 19:34:27 Using n1ql client
2023/02/16 19:34:27 MultiScanCount = 10002 ExpectedMultiScanCount = 10002
2023/02/16 19:34:27 
--- FiltersWithUnbounded ---
2023/02/16 19:34:27 distinct = false
2023/02/16 19:34:28 Using n1ql client
2023/02/16 19:34:28 MultiScanCount = 3173 ExpectedMultiScanCount = 3173
2023/02/16 19:34:28 
--- FiltersLowGreaterThanHigh ---
2023/02/16 19:34:28 distinct = false
2023/02/16 19:34:28 Using n1ql client
2023/02/16 19:34:28 MultiScanCount = 418 ExpectedMultiScanCount = 418
2023/02/16 19:34:28 

--------- Simple Index with 1 field ---------
2023/02/16 19:34:28 
--- SingleIndexSimpleRange ---
2023/02/16 19:34:28 distinct = true
2023/02/16 19:34:28 Using n1ql client
2023/02/16 19:34:28 MultiScanCount = 227 ExpectedMultiScanCount = 227
2023/02/16 19:34:28 
--- SingleIndex_SimpleRanges_NonOverlapping ---
2023/02/16 19:34:28 distinct = true
2023/02/16 19:34:29 Using n1ql client
2023/02/16 19:34:29 MultiScanCount = 713 ExpectedMultiScanCount = 713
2023/02/16 19:34:29 
--- SingleIndex_SimpleRanges_Overlapping ---
2023/02/16 19:34:29 distinct = true
2023/02/16 19:34:29 Using n1ql client
2023/02/16 19:34:29 MultiScanCount = 869 ExpectedMultiScanCount = 869
2023/02/16 19:34:29 

--------- Composite Index with 3 fields ---------
2023/02/16 19:34:29 
--- ScanAllNoFilter ---
2023/02/16 19:34:29 distinct = true
2023/02/16 19:34:30 Using n1ql client
2023/02/16 19:34:30 MultiScanCount = 999 ExpectedMultiScanCount = 999
2023/02/16 19:34:30 
--- ScanAllFilterNil ---
2023/02/16 19:34:30 distinct = true
2023/02/16 19:34:30 Using n1ql client
2023/02/16 19:34:30 MultiScanCount = 999 ExpectedMultiScanCount = 999
2023/02/16 19:34:30 
--- ScanAll_AllFiltersNil ---
2023/02/16 19:34:30 distinct = true
2023/02/16 19:34:31 Using n1ql client
2023/02/16 19:34:31 MultiScanCount = 999 ExpectedMultiScanCount = 999
2023/02/16 19:34:31 
--- 3FieldsSingleSeek ---
2023/02/16 19:34:31 distinct = true
2023/02/16 19:34:31 Using n1ql client
2023/02/16 19:34:31 MultiScanCount = 1 ExpectedMultiScanCount = 1
2023/02/16 19:34:31 
--- 3FieldsMultipleSeeks ---
2023/02/16 19:34:31 distinct = true
2023/02/16 19:34:31 Using n1ql client
2023/02/16 19:34:31 MultiScanCount = 3 ExpectedMultiScanCount = 3
2023/02/16 19:34:31 
--- 3FieldsMultipleSeeks_Identical ---
2023/02/16 19:34:31 distinct = true
2023/02/16 19:34:32 Using n1ql client
2023/02/16 19:34:32 MultiScanCount = 2 ExpectedMultiScanCount = 2
--- PASS: TestMultiScanDescCount (29.79s)
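Note how the expected counts drop from 10002 to 999 once the log switches to `distinct = true`: with DISTINCT, duplicate values of the projected key are counted once. A minimal sketch of that counting rule, assuming string keys (the function is illustrative, not the actual test helper):

```go
package main

import "fmt"

// multiScanCount mirrors the MultiScanCount validation pattern above:
// with distinct=false every matching entry is counted; with
// distinct=true duplicates of the projected key count once.
func multiScanCount(keys []string, distinct bool) int {
	if !distinct {
		return len(keys)
	}
	seen := make(map[string]bool)
	for _, k := range keys {
		seen[k] = true
	}
	return len(seen)
}

func main() {
	keys := []string{"a", "b", "a", "c"}
	fmt.Println(multiScanCount(keys, false)) // 4
	fmt.Println(multiScanCount(keys, true))  // 3
}
```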
=== RUN   TestMultiScanDescOffset
2023/02/16 19:34:32 In SkipTestMultiScanDescOffset()
2023/02/16 19:34:32 

--------- Composite Index with 2 fields ---------
2023/02/16 19:34:32 
--- ScanAllNoFilter ---
2023/02/16 19:34:32 distinct = false
2023/02/16 19:34:32 Using n1ql client
2023/02/16 19:34:32 
--- ScanAllFilterNil ---
2023/02/16 19:34:32 distinct = false
2023/02/16 19:34:33 Using n1ql client
2023/02/16 19:34:33 
--- ScanAll_AllFiltersNil ---
2023/02/16 19:34:33 distinct = false
2023/02/16 19:34:33 Using n1ql client
2023/02/16 19:34:33 
--- SingleSeek ---
2023/02/16 19:34:33 distinct = false
2023/02/16 19:34:34 Using n1ql client
2023/02/16 19:34:34 
--- MultipleSeek ---
2023/02/16 19:34:34 distinct = false
2023/02/16 19:34:34 Using n1ql client
2023/02/16 19:34:34 
--- SimpleRange ---
2023/02/16 19:34:34 distinct = false
2023/02/16 19:34:34 Using n1ql client
2023/02/16 19:34:34 
--- NonOverlappingRanges ---
2023/02/16 19:34:34 distinct = false
2023/02/16 19:34:35 Using n1ql client
2023/02/16 19:34:35 
--- OverlappingRanges ---
2023/02/16 19:34:35 distinct = false
2023/02/16 19:34:35 Using n1ql client
2023/02/16 19:34:35 
--- NonOverlappingFilters ---
2023/02/16 19:34:35 distinct = false
2023/02/16 19:34:36 Using n1ql client
2023/02/16 19:34:36 
--- OverlappingFilters ---
2023/02/16 19:34:36 distinct = false
2023/02/16 19:34:36 Using n1ql client
2023/02/16 19:34:36 
--- BoundaryFilters ---
2023/02/16 19:34:36 distinct = false
2023/02/16 19:34:36 Using n1ql client
2023/02/16 19:34:36 
--- SeekAndFilters_NonOverlapping ---
2023/02/16 19:34:36 distinct = false
2023/02/16 19:34:37 Using n1ql client
2023/02/16 19:34:37 
--- SeekAndFilters_Overlapping ---
2023/02/16 19:34:37 distinct = false
2023/02/16 19:34:37 Using n1ql client
2023/02/16 19:34:37 
--- SimpleRangeLowUnbounded ---
2023/02/16 19:34:37 distinct = false
2023/02/16 19:34:38 Using n1ql client
2023/02/16 19:34:38 Expected and Actual scan responses are the same
2023/02/16 19:34:38 
--- SimpleRangeHighUnbounded ---
2023/02/16 19:34:38 distinct = false
2023/02/16 19:34:38 Using n1ql client
2023/02/16 19:34:38 Expected and Actual scan responses are the same
2023/02/16 19:34:38 
--- SimpleRangeMultipleUnbounded ---
2023/02/16 19:34:38 distinct = false
2023/02/16 19:34:38 Using n1ql client
2023/02/16 19:34:39 Expected and Actual scan responses are the same
2023/02/16 19:34:39 
--- FiltersWithUnbounded ---
2023/02/16 19:34:39 distinct = false
2023/02/16 19:34:39 Using n1ql client
2023/02/16 19:34:39 Expected and Actual scan responses are the same
2023/02/16 19:34:39 
--- FiltersLowGreaterThanHigh ---
2023/02/16 19:34:39 distinct = false
2023/02/16 19:34:39 Using n1ql client
2023/02/16 19:34:39 Expected and Actual scan responses are the same
2023/02/16 19:34:39 

--------- Simple Index with 1 field ---------
2023/02/16 19:34:39 
--- SingleIndexSimpleRange ---
2023/02/16 19:34:39 distinct = false
2023/02/16 19:34:40 Using n1ql client
2023/02/16 19:34:40 
--- SingleIndex_SimpleRanges_NonOverlapping ---
2023/02/16 19:34:40 distinct = false
2023/02/16 19:34:40 Using n1ql client
2023/02/16 19:34:40 
--- SingleIndex_SimpleRanges_Overlapping ---
2023/02/16 19:34:40 distinct = false
2023/02/16 19:34:40 Using n1ql client
2023/02/16 19:34:40 

--------- Composite Index with 3 fields ---------
2023/02/16 19:34:40 
--- ScanAllNoFilter ---
2023/02/16 19:34:40 distinct = false
2023/02/16 19:34:41 Using n1ql client
2023/02/16 19:34:41 
--- ScanAllFilterNil ---
2023/02/16 19:34:41 distinct = false
2023/02/16 19:34:41 Using n1ql client
2023/02/16 19:34:41 
--- ScanAll_AllFiltersNil ---
2023/02/16 19:34:41 distinct = false
2023/02/16 19:34:42 Using n1ql client
2023/02/16 19:34:42 
--- 3FieldsSingleSeek ---
2023/02/16 19:34:42 distinct = false
2023/02/16 19:34:42 Using n1ql client
2023/02/16 19:34:42 
--- 3FieldsMultipleSeeks ---
2023/02/16 19:34:42 distinct = false
2023/02/16 19:34:43 Using n1ql client
2023/02/16 19:34:43 
--- 3FieldsMultipleSeeks_Identical ---
2023/02/16 19:34:43 distinct = false
2023/02/16 19:34:43 Using n1ql client
--- PASS: TestMultiScanDescOffset (11.25s)
=== RUN   TestMultiScanDescDistinct
2023/02/16 19:34:43 In SkipTestMultiScanDescDistinct()
2023/02/16 19:34:43 

--------- Composite Index with 2 fields ---------
2023/02/16 19:34:43 
--- ScanAllNoFilter ---
2023/02/16 19:34:43 distinct = true
2023/02/16 19:34:44 Using n1ql client
2023/02/16 19:34:44 Expected and Actual scan responses are the same
2023/02/16 19:34:44 
--- ScanAllFilterNil ---
2023/02/16 19:34:44 distinct = true
2023/02/16 19:34:44 Using n1ql client
2023/02/16 19:34:44 Expected and Actual scan responses are the same
2023/02/16 19:34:44 
--- ScanAll_AllFiltersNil ---
2023/02/16 19:34:44 distinct = true
2023/02/16 19:34:45 Using n1ql client
2023/02/16 19:34:45 Expected and Actual scan responses are the same
2023/02/16 19:34:45 
--- SingleSeek ---
2023/02/16 19:34:45 distinct = true
2023/02/16 19:34:45 Using n1ql client
2023/02/16 19:34:45 Expected and Actual scan responses are the same
2023/02/16 19:34:45 
--- MultipleSeek ---
2023/02/16 19:34:45 distinct = true
2023/02/16 19:34:45 Using n1ql client
2023/02/16 19:34:45 Expected and Actual scan responses are the same
2023/02/16 19:34:45 
--- SimpleRange ---
2023/02/16 19:34:45 distinct = true
2023/02/16 19:34:46 Using n1ql client
2023/02/16 19:34:46 Expected and Actual scan responses are the same
2023/02/16 19:34:46 
--- NonOverlappingRanges ---
2023/02/16 19:34:46 distinct = true
2023/02/16 19:34:46 Using n1ql client
2023/02/16 19:34:46 Expected and Actual scan responses are the same
2023/02/16 19:34:46 
--- OverlappingRanges ---
2023/02/16 19:34:46 distinct = true
2023/02/16 19:34:47 Using n1ql client
2023/02/16 19:34:47 Expected and Actual scan responses are the same
2023/02/16 19:34:47 
--- NonOverlappingFilters ---
2023/02/16 19:34:47 distinct = true
2023/02/16 19:34:47 Using n1ql client
2023/02/16 19:34:47 Expected and Actual scan responses are the same
2023/02/16 19:34:47 
--- OverlappingFilters ---
2023/02/16 19:34:47 distinct = true
2023/02/16 19:34:48 Using n1ql client
2023/02/16 19:34:48 Expected and Actual scan responses are the same
2023/02/16 19:34:48 
--- BoundaryFilters ---
2023/02/16 19:34:48 distinct = true
2023/02/16 19:34:48 Using n1ql client
2023/02/16 19:34:48 Expected and Actual scan responses are the same
2023/02/16 19:34:48 
--- SeekAndFilters_NonOverlapping ---
2023/02/16 19:34:48 distinct = true
2023/02/16 19:34:49 Using n1ql client
2023/02/16 19:34:49 Expected and Actual scan responses are the same
2023/02/16 19:34:49 
--- SeekAndFilters_Overlapping ---
2023/02/16 19:34:49 distinct = true
2023/02/16 19:34:49 Using n1ql client
2023/02/16 19:34:49 Expected and Actual scan responses are the same
2023/02/16 19:34:49 
--- SimpleRangeLowUnbounded ---
2023/02/16 19:34:49 distinct = false
2023/02/16 19:34:49 Using n1ql client
2023/02/16 19:34:49 Expected and Actual scan responses are the same
2023/02/16 19:34:49 
--- SimpleRangeHighUnbounded ---
2023/02/16 19:34:49 distinct = false
2023/02/16 19:34:50 Using n1ql client
2023/02/16 19:34:50 Expected and Actual scan responses are the same
2023/02/16 19:34:50 
--- SimpleRangeMultipleUnbounded ---
2023/02/16 19:34:50 distinct = false
2023/02/16 19:34:50 Using n1ql client
2023/02/16 19:34:50 Expected and Actual scan responses are the same
2023/02/16 19:34:50 
--- FiltersWithUnbounded ---
2023/02/16 19:34:50 distinct = false
2023/02/16 19:34:51 Using n1ql client
2023/02/16 19:34:51 Expected and Actual scan responses are the same
2023/02/16 19:34:51 
--- FiltersLowGreaterThanHigh ---
2023/02/16 19:34:51 distinct = false
2023/02/16 19:34:51 Using n1ql client
2023/02/16 19:34:51 Expected and Actual scan responses are the same
2023/02/16 19:34:51 

--------- Simple Index with 1 field ---------
2023/02/16 19:34:51 
--- SingleIndexSimpleRange ---
2023/02/16 19:34:51 distinct = true
2023/02/16 19:34:51 Using n1ql client
2023/02/16 19:34:51 Expected and Actual scan responses are the same
2023/02/16 19:34:51 
--- SingleIndex_SimpleRanges_NonOverlapping ---
2023/02/16 19:34:51 distinct = true
2023/02/16 19:34:52 Using n1ql client
2023/02/16 19:34:52 Expected and Actual scan responses are the same
2023/02/16 19:34:52 
--- SingleIndex_SimpleRanges_Overlapping ---
2023/02/16 19:34:52 distinct = true
2023/02/16 19:34:52 Using n1ql client
2023/02/16 19:34:52 Expected and Actual scan responses are the same
2023/02/16 19:34:52 

--------- Composite Index with 3 fields ---------
2023/02/16 19:34:52 
--- ScanAllNoFilter ---
2023/02/16 19:34:52 distinct = true
2023/02/16 19:34:53 Using n1ql client
2023/02/16 19:34:53 Expected and Actual scan responses are the same
2023/02/16 19:34:53 
--- ScanAllFilterNil ---
2023/02/16 19:34:53 distinct = true
2023/02/16 19:34:53 Using n1ql client
2023/02/16 19:34:53 Expected and Actual scan responses are the same
2023/02/16 19:34:53 
--- ScanAll_AllFiltersNil ---
2023/02/16 19:34:53 distinct = true
2023/02/16 19:34:54 Using n1ql client
2023/02/16 19:34:54 Expected and Actual scan responses are the same
2023/02/16 19:34:54 
--- 3FieldsSingleSeek ---
2023/02/16 19:34:54 distinct = true
2023/02/16 19:34:54 Using n1ql client
2023/02/16 19:34:54 Expected and Actual scan responses are the same
2023/02/16 19:34:54 
--- 3FieldsMultipleSeeks ---
2023/02/16 19:34:54 distinct = true
2023/02/16 19:34:55 Using n1ql client
2023/02/16 19:34:55 Expected and Actual scan responses are the same
2023/02/16 19:34:55 
--- 3FieldsMultipleSeeks_Identical ---
2023/02/16 19:34:55 distinct = true
2023/02/16 19:34:55 Using n1ql client
2023/02/16 19:34:55 Expected and Actual scan responses are the same
--- PASS: TestMultiScanDescDistinct (11.89s)
=== RUN   TestGroupAggrSetup
2023/02/16 19:34:55 In TestGroupAggrSetup()
2023/02/16 19:34:55 Emptying the default bucket
2023/02/16 19:34:58 Flush Enabled on bucket default, responseBody: 
2023/02/16 19:35:37 Flushed the bucket default, Response body: 
2023/02/16 19:35:37 Dropping the secondary index index_agg
2023/02/16 19:35:37 Populating the default bucket
2023/02/16 19:35:41 Created the secondary index index_agg. Waiting for it become active
2023/02/16 19:35:41 Index is 12494855080447363359 now active
--- PASS: TestGroupAggrSetup (52.25s)
=== RUN   TestGroupAggrLeading
2023/02/16 19:35:47 In TestGroupAggrLeading()
2023/02/16 19:35:47 Total Scanresults = 7
2023/02/16 19:35:47 Expected and Actual scan responses are the same
2023/02/16 19:35:47 Total Scanresults = 3
2023/02/16 19:35:47 Expected and Actual scan responses are the same
--- PASS: TestGroupAggrLeading (0.01s)
=== RUN   TestGroupAggrNonLeading
2023/02/16 19:35:47 In TestGroupAggrNonLeading()
2023/02/16 19:35:47 Total Scanresults = 4
2023/02/16 19:35:47 Expected and Actual scan responses are the same
--- PASS: TestGroupAggrNonLeading (0.01s)
=== RUN   TestGroupAggrNoGroup
2023/02/16 19:35:47 In TestGroupAggrNoGroup()
2023/02/16 19:35:47 Total Scanresults = 1
2023/02/16 19:35:47 Expected and Actual scan responses are the same
--- PASS: TestGroupAggrNoGroup (0.03s)
=== RUN   TestGroupAggrMinMax
2023/02/16 19:35:47 In TestGroupAggrMinMax()
2023/02/16 19:35:47 Total Scanresults = 4
2023/02/16 19:35:47 Expected and Actual scan responses are the same
--- PASS: TestGroupAggrMinMax (0.00s)
=== RUN   TestGroupAggrMinMax2
2023/02/16 19:35:47 In TestGroupAggrMinMax()
2023/02/16 19:35:47 Total Scanresults = 1
2023/02/16 19:35:47 Expected and Actual scan responses are the same
--- PASS: TestGroupAggrMinMax2 (0.00s)
=== RUN   TestGroupAggrLeading_N1QLExprs
2023/02/16 19:35:47 In TestGroupAggrLeading_N1QLExprs()
2023/02/16 19:35:47 Total Scanresults = 17
2023/02/16 19:35:48 basicGroupAggrN1QLExprs1: Scan validation passed
2023/02/16 19:35:48 Total Scanresults = 9
2023/02/16 19:35:48 basicGroupAggrN1QLExprs2: Scan validation passed
--- PASS: TestGroupAggrLeading_N1QLExprs (0.24s)
=== RUN   TestGroupAggrLimit
2023/02/16 19:35:48 In TestGroupAggrLimit()
2023/02/16 19:35:48 Total Scanresults = 3
2023/02/16 19:35:48 Expected and Actual scan responses are the same
2023/02/16 19:35:48 Total Scanresults = 2
2023/02/16 19:35:48 Expected and Actual scan responses are the same
--- PASS: TestGroupAggrLimit (0.00s)
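The group-aggregate tests above validate indexer scan results against an independently computed expected set, with LIMIT applied after grouping (3 rows, then 2). A minimal sketch of that shape, with hypothetical document fields and helper names (not from the test suite):

```go
package main

import (
	"fmt"
	"sort"
)

// doc mimics a bucket document with a group key and a value to aggregate.
type doc struct {
	Company string
	Age     int
}

// groupCountWithLimit groups docs by Company, counts each group, sorts the
// group keys, and returns at most `limit` rows — the result shape the
// TestGroupAggrLimit scans compare against.
func groupCountWithLimit(docs []doc, limit int) []string {
	counts := map[string]int{}
	for _, d := range docs {
		counts[d.Company]++
	}
	keys := make([]string, 0, len(counts))
	for k := range counts {
		keys = append(keys, k)
	}
	sort.Strings(keys)
	if limit >= 0 && limit < len(keys) {
		keys = keys[:limit]
	}
	rows := make([]string, 0, len(keys))
	for _, k := range keys {
		rows = append(rows, fmt.Sprintf("%s:%d", k, counts[k]))
	}
	return rows
}

func main() {
	docs := []doc{{"ACCEL", 30}, {"ACCEL", 41}, {"FANFARE", 25}, {"OZEAN", 50}}
	fmt.Println(groupCountWithLimit(docs, 2)) // [ACCEL:2 FANFARE:1]
}
```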
=== RUN   TestGroupAggrOffset
2023/02/16 19:35:48 In TestGroupAggrOffset()
2023/02/16 19:35:48 Total Scanresults = 3
2023/02/16 19:35:48 Expected and Actual scan responses are the same
2023/02/16 19:35:48 Total Scanresults = 2
2023/02/16 19:35:48 Expected and Actual scan responses are the same
--- PASS: TestGroupAggrOffset (0.00s)
=== RUN   TestGroupAggrCountN
2023/02/16 19:35:48 In TestGroupAggrCountN()
2023/02/16 19:35:48 Total Scanresults = 4
2023/02/16 19:35:48 Expected and Actual scan responses are the same
2023/02/16 19:35:48 Total Scanresults = 4
2023/02/16 19:35:48 Expected and Actual scan responses are the same
--- PASS: TestGroupAggrCountN (0.00s)
=== RUN   TestGroupAggrNoGroupNoMatch
2023/02/16 19:35:48 In TestGroupAggrNoGroupNoMatch()
2023/02/16 19:35:48 Total Scanresults = 1
2023/02/16 19:35:48 Expected and Actual scan responses are the same
--- PASS: TestGroupAggrNoGroupNoMatch (0.00s)
=== RUN   TestGroupAggrGroupNoMatch
2023/02/16 19:35:48 In TestGroupAggrGroupNoMatch()
2023/02/16 19:35:48 Total Scanresults = 0
2023/02/16 19:35:48 Expected and Actual scan responses are the same
--- PASS: TestGroupAggrGroupNoMatch (0.00s)
=== RUN   TestGroupAggrMultDataTypes
2023/02/16 19:35:48 In TestGroupAggrMultDataTypes()
2023/02/16 19:35:48 Total Scanresults = 8
2023/02/16 19:35:48 Expected and Actual scan responses are the same
--- PASS: TestGroupAggrMultDataTypes (0.00s)
=== RUN   TestGroupAggrDistinct
2023/02/16 19:35:48 In TestGroupAggrDistinct()
2023/02/16 19:35:48 Total Scanresults = 2
2023/02/16 19:35:48 Expected and Actual scan responses are the same
2023/02/16 19:35:48 Total Scanresults = 2
2023/02/16 19:35:48 Expected and Actual scan responses are the same
--- PASS: TestGroupAggrDistinct (0.00s)
=== RUN   TestGroupAggrDistinct2
2023/02/16 19:35:48 In TestGroupAggrDistinct2()
2023/02/16 19:35:48 Total Scanresults = 1
2023/02/16 19:35:48 Expected and Actual scan responses are the same
2023/02/16 19:35:48 Total Scanresults = 4
2023/02/16 19:35:48 Expected and Actual scan responses are the same
2023/02/16 19:35:48 Total Scanresults = 4
2023/02/16 19:35:48 Expected and Actual scan responses are the same
--- PASS: TestGroupAggrDistinct2 (0.01s)
=== RUN   TestGroupAggrNull
2023/02/16 19:35:48 In TestGroupAggrNull()
2023/02/16 19:35:48 Total Scanresults = 2
2023/02/16 19:35:48 Expected and Actual scan responses are the same
2023/02/16 19:35:48 Total Scanresults = 2
2023/02/16 19:35:48 Expected and Actual scan responses are the same
--- PASS: TestGroupAggrNull (0.00s)
=== RUN   TestGroupAggrInt64
2023/02/16 19:35:48 In TestGroupAggrInt64()
2023/02/16 19:35:48 Updating the default bucket
2023/02/16 19:35:48 Total Scanresults = 2
2023/02/16 19:35:48 Expected and Actual scan responses are the same
2023/02/16 19:35:48 Total Scanresults = 2
2023/02/16 19:35:48 Expected and Actual scan responses are the same
2023/02/16 19:35:48 Total Scanresults = 2
2023/02/16 19:35:48 Expected and Actual scan responses are the same
2023/02/16 19:35:48 Total Scanresults = 2
2023/02/16 19:35:48 Expected and Actual scan responses are the same
2023/02/16 19:35:48 Total Scanresults = 2
2023/02/16 19:35:48 Expected and Actual scan responses are the same
2023/02/16 19:35:48 Total Scanresults = 2
2023/02/16 19:35:48 Expected and Actual scan responses are the same
--- PASS: TestGroupAggrInt64 (0.20s)
=== RUN   TestGroupAggr1
2023/02/16 19:35:48 In TestGroupAggr1()
2023/02/16 19:35:48 In DropAllSecondaryIndexes()
2023/02/16 19:35:48 Index found:  index_agg
2023/02/16 19:35:48 Dropped index index_agg
2023/02/16 19:35:48 Index found:  index_company_name_age_desc
2023/02/16 19:35:48 Dropped index index_company_name_age_desc
2023/02/16 19:35:48 Index found:  index_companyname_desc
2023/02/16 19:35:48 Dropped index index_companyname_desc
2023/02/16 19:35:48 Index found:  #primary
2023/02/16 19:35:48 Dropped index #primary
2023/02/16 19:35:48 Index found:  index_company_desc
2023/02/16 19:35:48 Dropped index index_company_desc
2023/02/16 19:36:26 Flushed the bucket default, Response body: 
2023/02/16 19:36:37 Created the secondary index idx_aggrs. Waiting for it become active
2023/02/16 19:36:37 Index is 6770138824726068961 now active
2023/02/16 19:36:37 Total Scanresults = 633
2023/02/16 19:36:40 Total Scanresults = 743
--- PASS: TestGroupAggr1 (52.50s)
=== RUN   TestGroupAggrArrayIndex
2023/02/16 19:36:40 In TestGroupAggrArrayIndex()
2023/02/16 19:36:44 Created the secondary index ga_arr1. Waiting for it become active
2023/02/16 19:36:44 Index is 8189962899337839048 now active
2023/02/16 19:36:51 Created the secondary index ga_arr2. Waiting for it become active
2023/02/16 19:36:51 Index is 9110542512034721717 now active
2023/02/16 19:36:51 Scenario 1
2023/02/16 19:36:51 Total Scanresults = 633
2023/02/16 19:36:52 Scenario 2
2023/02/16 19:36:52 Total Scanresults = 2824
2023/02/16 19:36:53 Scenario 3
2023/02/16 19:36:53 Total Scanresults = 1
2023/02/16 19:36:53 Scenario 4
2023/02/16 19:36:53 Total Scanresults = 992
2023/02/16 19:36:54 Scenario 5
2023/02/16 19:36:54 Total Scanresults = 2824
2023/02/16 19:36:54 Scenario 6
2023/02/16 19:36:54 Total Scanresults = 1
2023/02/16 19:36:55 Scenario 7
2023/02/16 19:36:55 Total Scanresults = 2929
2023/02/16 19:36:58 Scenario 8
2023/02/16 19:36:58 Total Scanresults = 1171
2023/02/16 19:36:58 Scenario 9
2023/02/16 19:36:58 Total Scanresults = 1
2023/02/16 19:36:59 Scenario 10
2023/02/16 19:36:59 Total Scanresults = 633
2023/02/16 19:37:00 Scenario 11
2023/02/16 19:37:00 Total Scanresults = 1171
2023/02/16 19:37:01 Scenario 12
2023/02/16 19:37:01 Total Scanresults = 1
2023/02/16 19:37:01 Scenario 13
2023/02/16 19:37:05 Total Scanresults = 1
2023/02/16 19:37:05 Count of scanResults is 1
2023/02/16 19:37:05 Value: [2 133]
--- PASS: TestGroupAggrArrayIndex (25.34s)
=== RUN   TestGroupAggr_FirstValidAggrOnly
2023/02/16 19:37:06 In TestGroupAggr_FirstValidAggrOnly()
2023/02/16 19:37:06 In DropAllSecondaryIndexes()
2023/02/16 19:37:06 Index found:  ga_arr1
2023/02/16 19:37:06 Dropped index ga_arr1
2023/02/16 19:37:06 Index found:  PRIMARY_IDX_CBO_STATS
2023/02/16 19:37:06 Dropped index PRIMARY_IDX_CBO_STATS
2023/02/16 19:37:06 Index found:  test_oneperprimarykey
2023/02/16 19:37:06 Dropped index test_oneperprimarykey
2023/02/16 19:37:06 Index found:  #primary
2023/02/16 19:37:06 Dropped index #primary
2023/02/16 19:37:06 Index found:  idx_aggrs
2023/02/16 19:37:06 Dropped index idx_aggrs
2023/02/16 19:37:06 Index found:  ga_arr2
2023/02/16 19:37:06 Dropped index ga_arr2
2023/02/16 19:37:21 Created the secondary index idx_asc_3field. Waiting for it become active
2023/02/16 19:37:21 Index is 18043762484443045466 now active
2023/02/16 19:37:27 Created the secondary index idx_desc_3field. Waiting for it become active
2023/02/16 19:37:27 Index is 12709144341168171901 now active
2023/02/16 19:37:27 === MIN no group by ===
2023/02/16 19:37:27 Total Scanresults = 1
2023/02/16 19:37:27 Count of scanResults is 1
2023/02/16 19:37:27 Value: ["ACCEL"]
2023/02/16 19:37:27 === MIN no group by, no row match ===
2023/02/16 19:37:27 Total Scanresults = 1
2023/02/16 19:37:27 Count of scanResults is 1
2023/02/16 19:37:27 Value: [null]
2023/02/16 19:37:27 === MIN with group by ===
2023/02/16 19:37:27 Total Scanresults = 633
2023/02/16 19:37:28 === MIN with group by, no row match ===
2023/02/16 19:37:28 Total Scanresults = 0
2023/02/16 19:37:28 === One Aggr, no group by ===
2023/02/16 19:37:28 Total Scanresults = 1
2023/02/16 19:37:28 Count of scanResults is 1
2023/02/16 19:37:28 Value: ["FANFARE"]
2023/02/16 19:37:28 === One Aggr, no group by, no row match ===
2023/02/16 19:37:28 Total Scanresults = 1
2023/02/16 19:37:28 Count of scanResults is 1
2023/02/16 19:37:28 Value: [null]
2023/02/16 19:37:28 === Multiple Aggr, no group by ===
2023/02/16 19:37:28 Total Scanresults = 1
2023/02/16 19:37:28 Count of scanResults is 1
2023/02/16 19:37:28 Value: ["FANFARE" 15]
2023/02/16 19:37:28 === Multiple Aggr, no group by, no row match ===
2023/02/16 19:37:28 Total Scanresults = 1
2023/02/16 19:37:28 Count of scanResults is 1
2023/02/16 19:37:28 Value: [null null]
2023/02/16 19:37:29 === No Aggr, 1 group by ===
2023/02/16 19:37:29 Total Scanresults = 207
2023/02/16 19:37:29 === Aggr on non-leading key, previous equality filter, no group ===
2023/02/16 19:37:29 Total Scanresults = 1
2023/02/16 19:37:29 Count of scanResults is 1
2023/02/16 19:37:29 Value: [17]
2023/02/16 19:37:29 === Aggr on non-leading key, previous equality filters, no group ===
2023/02/16 19:37:29 Total Scanresults = 1
2023/02/16 19:37:29 Count of scanResults is 1
2023/02/16 19:37:29 Value: [null]
2023/02/16 19:37:29 === Aggr on non-leading key, previous non-equality filters, no group ===
2023/02/16 19:37:29 Total Scanresults = 1
2023/02/16 19:37:29 Count of scanResults is 1
2023/02/16 19:37:29 Value: ["Ahmed"]
2023/02/16 19:37:29 === MIN on desc, no group ===
2023/02/16 19:37:29 Total Scanresults = 1
2023/02/16 19:37:29 Count of scanResults is 1
2023/02/16 19:37:29 Value: ["FANFARE"]
2023/02/16 19:37:30 === MAX on asc, no group ===
2023/02/16 19:37:30 Total Scanresults = 1
2023/02/16 19:37:30 Count of scanResults is 1
2023/02/16 19:37:30 Value: ["OZEAN"]
2023/02/16 19:37:30 === MAX on desc, no group ===
2023/02/16 19:37:30 Total Scanresults = 1
2023/02/16 19:37:30 Count of scanResults is 1
2023/02/16 19:37:30 Value: ["OZEAN"]
2023/02/16 19:37:30 === COUNT(DISTINCT const_expr, no group ===
2023/02/16 19:37:30 Total Scanresults = 1
2023/02/16 19:37:30 Count of scanResults is 1
2023/02/16 19:37:30 Value: [1]
2023/02/16 19:37:30 === COUNT(DISTINCT const_expr, no group, no row match ===
2023/02/16 19:37:30 Total Scanresults = 1
2023/02/16 19:37:30 Count of scanResults is 1
2023/02/16 19:37:30 Value: [0]
2023/02/16 19:37:30 === COUNT(const_expr, no group ===
2023/02/16 19:37:30 Total Scanresults = 1
2023/02/16 19:37:30 Count of scanResults is 1
2023/02/16 19:37:30 Value: [321]
2023/02/16 19:37:35 Created the secondary index indexMinAggr. Waiting for it become active
2023/02/16 19:37:35 Index is 12231539893468981012 now active
2023/02/16 19:37:35 === Equality filter check: Equality for a, nil filter for b - inclusion 0, no filter for c ===
2023/02/16 19:37:35 Total Scanresults = 1
2023/02/16 19:37:35 Count of scanResults is 1
2023/02/16 19:37:35 Value: [5]
2023/02/16 19:37:35 === Equality filter check: Equality for a, nil filter for b - inclusion 3, no filter for c ===
2023/02/16 19:37:35 Total Scanresults = 1
2023/02/16 19:37:35 Count of scanResults is 1
2023/02/16 19:37:35 Value: [5]
2023/02/16 19:37:35 === Equality filter check: Equality for a, nil filter for b - inclusion 3, nil filter for c ===
2023/02/16 19:37:35 Total Scanresults = 1
2023/02/16 19:37:35 Count of scanResults is 1
2023/02/16 19:37:35 Value: [5]
--- PASS: TestGroupAggr_FirstValidAggrOnly (29.65s)
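Note the "no group by, no row match" cases above: the scan still returns one row whose value is null (`Value: [null]`), because an aggregate over an empty input is null rather than an empty result set. A sketch of that semantics using a pointer for SQL-style null (illustrative only, not the test suite's code):

```go
package main

import "fmt"

// minOrNull returns the minimum of vals, or nil when no rows match —
// mirroring the single-row [null] results logged above.
func minOrNull(vals []string) *string {
	if len(vals) == 0 {
		return nil // aggregate over empty input is null, not zero rows
	}
	m := vals[0]
	for _, v := range vals[1:] {
		if v < m {
			m = v
		}
	}
	return &m
}

func main() {
	if m := minOrNull([]string{"OZEAN", "ACCEL", "FANFARE"}); m != nil {
		fmt.Println(*m) // ACCEL — the "MIN no group by" value in the log
	}
	fmt.Println(minOrNull(nil)) // <nil>
}
```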
=== RUN   TestGroupAggrPrimary
2023/02/16 19:37:35 In TestGroupAggrPrimary()
2023/02/16 19:37:35 Total Scanresults = 1
2023/02/16 19:37:35 Total Scanresults = 1
2023/02/16 19:37:35 Total Scanresults = 1
2023/02/16 19:37:35 Total Scanresults = 1002
2023/02/16 19:37:36 Total Scanresults = 1002
2023/02/16 19:37:36 Total Scanresults = 1
2023/02/16 19:37:36 Total Scanresults = 1
2023/02/16 19:37:36 Total Scanresults = 1
2023/02/16 19:37:36 Total Scanresults = 1
2023/02/16 19:37:37 Total Scanresults = 1
2023/02/16 19:37:37 Total Scanresults = 1
2023/02/16 19:37:37 --- MB-28305 Scenario 1 ---
2023/02/16 19:37:37 Total Scanresults = 1
2023/02/16 19:37:37 Count of scanResults is 1
2023/02/16 19:37:37 Value: [0]
2023/02/16 19:37:37 --- MB-28305 Scenario 2 ---
2023/02/16 19:37:37 Total Scanresults = 1
2023/02/16 19:37:37 Count of scanResults is 1
2023/02/16 19:37:37 Value: [0]
2023/02/16 19:37:37 --- MB-28305 Scenario 3 ---
2023/02/16 19:37:37 Total Scanresults = 1
2023/02/16 19:37:37 Count of scanResults is 1
2023/02/16 19:37:37 Value: [0]
--- PASS: TestGroupAggrPrimary (1.52s)
=== RUN   TestGroupAggrDocumentKey
2023/02/16 19:37:37 In TestGroupAggrDocumentKey()
2023/02/16 19:37:37 Dropping the secondary index documentkey_idx1
2023/02/16 19:37:37 Dropping the secondary index documentkey_idx2
2023/02/16 19:37:37 Populating the default bucket for TestGroupAggrDocumentKey single key index
2023/02/16 19:37:41 Created the secondary index documentkey_idx1. Waiting for it become active
2023/02/16 19:37:41 Index is 9323832577780813173 now active
2023/02/16 19:37:41 Using n1ql client
2023/02/16 19:37:41 Scanresult Row  ["1"] :  
2023/02/16 19:37:41 Scanresult Row  ["2"] :  
2023/02/16 19:37:41 Scanresult Row  ["3"] :  
2023/02/16 19:37:41 Expected and Actual scan responses are the same
2023/02/16 19:37:41 Populating the default bucket for TestGroupAggrDocumentKey composite key index
2023/02/16 19:37:47 Created the secondary index documentkey_idx2. Waiting for it become active
2023/02/16 19:37:47 Index is 6225828538269969734 now active
2023/02/16 19:37:47 Using n1ql client
2023/02/16 19:37:48 Scanresult Row  ["1"] :  
2023/02/16 19:37:48 Scanresult Row  ["2"] :  
2023/02/16 19:37:48 Scanresult Row  ["3"] :  
2023/02/16 19:37:49 Expected and Actual scan responses are the same
--- PASS: TestGroupAggrDocumentKey (11.79s)
=== RUN   TestRebalanceSetupCluster
2023/02/16 19:37:49 set14_rebalance_test.go::TestRebalanceSetupCluster: entry: Current cluster configuration: map[127.0.0.1:9001:[index kv] 172.31.5.112:9000:[kv n1ql]]
2023/02/16 19:37:49 set14_rebalance_test.go::TestRebalanceSetupCluster: 1. Setting up initial cluster configuration
2023/02/16 19:37:49 Removing node(s): [127.0.0.1:9001 127.0.0.1:9002 127.0.0.1:9003] from the cluster
2023/02/16 19:37:54 Rebalance progress: 2.376302083333333
2023/02/16 19:37:59 Rebalance progress: 6.4453125
2023/02/16 19:38:04 Rebalance progress: 9.244791666666668
2023/02/16 19:38:09 Rebalance progress: 12.85807291666667
2023/02/16 19:38:14 Rebalance progress: 17.70833333333333
2023/02/16 19:38:19 Rebalance progress: 22.63997395833333
2023/02/16 19:38:24 Rebalance progress: 25
2023/02/16 19:38:29 Rebalance progress: 25
2023/02/16 19:38:35 Rebalance progress: 25.71614583333333
2023/02/16 19:38:39 Rebalance progress: 28.515625
2023/02/16 19:38:44 Rebalance progress: 31.25
2023/02/16 19:38:49 Rebalance progress: 33.69140625
2023/02/16 19:38:54 Rebalance progress: 36.45833333333333
2023/02/16 19:38:59 Rebalance progress: 39.12760416666667
2023/02/16 19:39:04 Rebalance progress: 40.88541666666667
2023/02/16 19:39:09 Rebalance progress: 43.96158854166666
2023/02/16 19:39:14 Rebalance progress: 47.216796875
2023/02/16 19:39:19 Rebalance progress: 50
2023/02/16 19:39:24 Rebalance progress: 50
2023/02/16 19:39:29 Rebalance progress: 50
2023/02/16 19:39:34 Rebalance progress: 100
2023/02/16 19:39:34 Adding node: https://127.0.0.1:19001 with role: index to the cluster
2023/02/16 19:39:45 AddNode: Successfully added node: 127.0.0.1:9001 (role index), response: {"otpNode":"n_1@127.0.0.1"}
2023/02/16 19:39:51 Rebalance progress: 100
2023/02/16 19:39:51 set14_rebalance_test.go::TestRebalanceSetupCluster: 2. Changing indexer.settings.rebalance.redistribute_indexes to true
2023/02/16 19:39:51 Changing config key indexer.settings.rebalance.redistribute_indexes to value true
2023/02/16 19:39:51 set14_rebalance_test.go::TestRebalanceSetupCluster: exit: Current cluster configuration: map[127.0.0.1:9001:[index] 172.31.5.112:9000:[kv n1ql]]
--- PASS: TestRebalanceSetupCluster (122.22s)
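The rebalance above is polled roughly every 5 seconds until progress reaches 100. A sketch of parsing those progress lines as they appear in this log (the REST call that produces the number is omitted; `parseProgress` is a hypothetical helper):

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// parseProgress extracts the numeric progress from a log line of the form
// "2023/02/16 19:39:14 Rebalance progress: 47.216796875".
// It returns ok=false for lines that do not carry a progress value.
func parseProgress(line string) (float64, bool) {
	const prefix = "Rebalance progress: "
	i := strings.Index(line, prefix)
	if i < 0 {
		return 0, false
	}
	p, err := strconv.ParseFloat(strings.TrimSpace(line[i+len(prefix):]), 64)
	if err != nil {
		return 0, false
	}
	return p, true
}

func main() {
	lines := []string{
		"2023/02/16 19:39:29 Rebalance progress: 50",
		"2023/02/16 19:39:34 Rebalance progress: 100",
	}
	for _, l := range lines {
		if p, ok := parseProgress(l); ok && p >= 100 {
			fmt.Println("rebalance complete")
		}
	}
}
```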
=== RUN   TestCreateDocsBeforeRebalance
2023/02/16 19:39:51 set14_rebalance_test.go::TestCreateDocsBeforeRebalance: entry: Current cluster configuration: map[127.0.0.1:9001:[index] 172.31.5.112:9000:[kv n1ql]]
2023/02/16 19:39:51 set14_rebalance_test.go::TestCreateDocsBeforeRebalance: 1. Creating 100 documents
2023/02/16 19:39:51 set14_rebalance_test.go::TestCreateDocsBeforeRebalance: 100 documents created
2023/02/16 19:39:51 set14_rebalance_test.go::TestCreateDocsBeforeRebalance: exit: Current cluster configuration: map[127.0.0.1:9001:[index] 172.31.5.112:9000:[kv n1ql]]
--- PASS: TestCreateDocsBeforeRebalance (0.15s)
=== RUN   TestCreateIndexesBeforeRebalance
2023/02/16 19:39:51 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: entry: Current cluster configuration: map[127.0.0.1:9001:[index] 172.31.5.112:9000:[kv n1ql]]
2023/02/16 19:39:51 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: 1. Creating 17 indexes: non-partitioned, 0-replica, non-deferred
2023/02/16 19:39:51 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_PLAIN__id on `default`(_id)
2023/02/16 19:39:55 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_PLAIN__id index is now active.
2023/02/16 19:39:55 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_PLAIN_docid on `default`(docid)
2023/02/16 19:40:01 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_PLAIN_docid index is now active.
2023/02/16 19:40:01 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_PLAIN_guid on `default`(guid)
2023/02/16 19:40:07 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_PLAIN_guid index is now active.
2023/02/16 19:40:07 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_PLAIN_isActive on `default`(isActive)
2023/02/16 19:40:13 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_PLAIN_isActive index is now active.
2023/02/16 19:40:13 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_PLAIN_balance on `default`(balance)
2023/02/16 19:40:19 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_PLAIN_balance index is now active.
2023/02/16 19:40:19 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_PLAIN_picture on `default`(picture)
2023/02/16 19:40:25 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_PLAIN_picture index is now active.
2023/02/16 19:40:25 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_PLAIN_age on `default`(age)
2023/02/16 19:40:31 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_PLAIN_age index is now active.
2023/02/16 19:40:31 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_PLAIN_eyeColor on `default`(eyeColor)
2023/02/16 19:40:37 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_PLAIN_eyeColor index is now active.
2023/02/16 19:40:37 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_PLAIN_name on `default`(name)
2023/02/16 19:40:43 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_PLAIN_name index is now active.
2023/02/16 19:40:43 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_PLAIN_gender on `default`(gender)
2023/02/16 19:40:50 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_PLAIN_gender index is now active.
2023/02/16 19:40:50 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_PLAIN_company on `default`(company)
2023/02/16 19:40:56 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_PLAIN_company index is now active.
2023/02/16 19:40:56 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_PLAIN_email on `default`(email)
2023/02/16 19:41:02 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_PLAIN_email index is now active.
2023/02/16 19:41:02 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_PLAIN_phone on `default`(phone)
2023/02/16 19:41:08 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_PLAIN_phone index is now active.
2023/02/16 19:41:08 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_PLAIN_registered on `default`(registered)
2023/02/16 19:41:14 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_PLAIN_registered index is now active.
2023/02/16 19:41:14 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_PLAIN_latitude on `default`(latitude)
2023/02/16 19:41:20 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_PLAIN_latitude index is now active.
2023/02/16 19:41:20 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_PLAIN_longitude on `default`(longitude)
2023/02/16 19:41:26 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_PLAIN_longitude index is now active.
2023/02/16 19:41:26 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_PLAIN_favoriteFruit on `default`(favoriteFruit)
2023/02/16 19:41:32 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_PLAIN_favoriteFruit index is now active.
2023/02/16 19:41:32 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: 2. Creating 17 indexes: non-partitioned, 0-replica, DEFERRED
2023/02/16 19:41:32 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_DEFERRED__id_docid on `default`(_id, docid) with {"defer_build":true}
2023/02/16 19:41:32 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_DEFERRED__id_docid index is now deferred.
2023/02/16 19:41:32 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_DEFERRED_docid_guid on `default`(docid, guid) with {"defer_build":true}
2023/02/16 19:41:32 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_DEFERRED_docid_guid index is now deferred.
2023/02/16 19:41:32 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_DEFERRED_guid_isActive on `default`(guid, isActive) with {"defer_build":true}
2023/02/16 19:41:32 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_DEFERRED_guid_isActive index is now deferred.
2023/02/16 19:41:32 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_DEFERRED_isActive_balance on `default`(isActive, balance) with {"defer_build":true}
2023/02/16 19:41:33 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_DEFERRED_isActive_balance index is now deferred.
2023/02/16 19:41:33 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_DEFERRED_balance_picture on `default`(balance, picture) with {"defer_build":true}
2023/02/16 19:41:33 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_DEFERRED_balance_picture index is now deferred.
2023/02/16 19:41:33 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_DEFERRED_picture_age on `default`(picture, age) with {"defer_build":true}
2023/02/16 19:41:33 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_DEFERRED_picture_age index is now deferred.
2023/02/16 19:41:33 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_DEFERRED_age_eyeColor on `default`(age, eyeColor) with {"defer_build":true}
2023/02/16 19:41:33 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_DEFERRED_age_eyeColor index is now deferred.
2023/02/16 19:41:33 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_DEFERRED_eyeColor_name on `default`(eyeColor, name) with {"defer_build":true}
2023/02/16 19:41:33 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_DEFERRED_eyeColor_name index is now deferred.
2023/02/16 19:41:33 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_DEFERRED_name_gender on `default`(name, gender) with {"defer_build":true}
2023/02/16 19:41:33 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_DEFERRED_name_gender index is now deferred.
2023/02/16 19:41:33 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_DEFERRED_gender_company on `default`(gender, company) with {"defer_build":true}
2023/02/16 19:41:34 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_DEFERRED_gender_company index is now deferred.
2023/02/16 19:41:34 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_DEFERRED_company_email on `default`(company, email) with {"defer_build":true}
2023/02/16 19:41:34 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_DEFERRED_company_email index is now deferred.
2023/02/16 19:41:34 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_DEFERRED_email_phone on `default`(email, phone) with {"defer_build":true}
2023/02/16 19:41:34 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_DEFERRED_email_phone index is now deferred.
2023/02/16 19:41:34 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_DEFERRED_phone_registered on `default`(phone, registered) with {"defer_build":true}
2023/02/16 19:41:35 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_DEFERRED_phone_registered index is now deferred.
2023/02/16 19:41:35 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_DEFERRED_registered_latitude on `default`(registered, latitude) with {"defer_build":true}
2023/02/16 19:41:35 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_DEFERRED_registered_latitude index is now deferred.
2023/02/16 19:41:35 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_DEFERRED_latitude_longitude on `default`(latitude, longitude) with {"defer_build":true}
2023/02/16 19:41:35 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_DEFERRED_latitude_longitude index is now deferred.
2023/02/16 19:41:35 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_DEFERRED_longitude_favoriteFruit on `default`(longitude, favoriteFruit) with {"defer_build":true}
2023/02/16 19:41:35 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_DEFERRED_longitude_favoriteFruit index is now deferred.
2023/02/16 19:41:35 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_DEFERRED_favoriteFruit__id on `default`(favoriteFruit, _id) with {"defer_build":true}
2023/02/16 19:41:35 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_DEFERRED_favoriteFruit__id index is now deferred.
2023/02/16 19:41:35 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: 3. Creating 3 indexes: 7-PARTITION, 0-replica, non-deferred
2023/02/16 19:41:35 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_7PARTITIONS__id_guid on `default`(_id, guid) partition by hash(Meta().id) with {"num_partition":7}
2023/02/16 19:41:40 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_7PARTITIONS__id_guid index is now active.
2023/02/16 19:41:40 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_7PARTITIONS_docid_isActive on `default`(docid, isActive) partition by hash(Meta().id) with {"num_partition":7}
2023/02/16 19:41:46 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_7PARTITIONS_docid_isActive index is now active.
2023/02/16 19:41:46 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_7PARTITIONS_guid_balance on `default`(guid, balance) partition by hash(Meta().id) with {"num_partition":7}
2023/02/16 19:41:53 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: set14_idx_7PARTITIONS_guid_balance index is now active.
2023/02/16 19:41:53 set14_rebalance_test.go::TestCreateIndexesBeforeRebalance: exit: Current cluster configuration: map[127.0.0.1:9001:[index] 172.31.5.112:9000:[kv n1ql]]
--- PASS: TestCreateIndexesBeforeRebalance (122.13s)
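The `create index` statements executed in this run follow a handful of templates, copied verbatim in the log: plain, deferred (`with {"defer_build":true}`), partitioned (`partition by hash(Meta().id) with {"num_partition":N}`), and replicated (`with {"num_replica":N}`, used by the next test). A sketch that regenerates those DDL strings; the templates are taken from the logged N1QL, while the function itself is hypothetical:

```go
package main

import "fmt"

// indexDDL builds the create-index forms seen in this log. Only one option
// is applied per statement, matching how the tests issue them.
func indexDDL(name, bucket, fields string, deferred bool, partitions, replicas int) string {
	ddl := fmt.Sprintf("create index %s on `%s`(%s)", name, bucket, fields)
	switch {
	case partitions > 0:
		ddl += fmt.Sprintf(` partition by hash(Meta().id) with {"num_partition":%d}`, partitions)
	case deferred:
		ddl += ` with {"defer_build":true}`
	case replicas > 0:
		ddl += fmt.Sprintf(` with {"num_replica":%d}`, replicas)
	}
	return ddl
}

func main() {
	// Each line reproduces a statement logged above.
	fmt.Println(indexDDL("set14_idx_PLAIN__id", "default", "_id", false, 0, 0))
	fmt.Println(indexDDL("set14_idx_DEFERRED__id_docid", "default", "_id, docid", true, 0, 0))
	fmt.Println(indexDDL("set14_idx_7PARTITIONS__id_guid", "default", "_id, guid", false, 7, 0))
	fmt.Println(indexDDL("set14_idx_2REPLICAS__id_isActive", "default", "_id, isActive", false, 0, 2))
}
```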
=== RUN   TestIndexNodeRebalanceIn
2023/02/16 19:41:53 TestIndexNodeRebalanceIn entry: Current cluster configuration: map[127.0.0.1:9001:[index] 172.31.5.112:9000:[kv n1ql]]
2023/02/16 19:41:53 TestIndexNodeRebalanceIn: 1. Adding index node 127.0.0.1:9002 to the cluster
2023/02/16 19:41:53 Adding node: https://127.0.0.1:19002 with role: index to the cluster
2023/02/16 19:42:02 AddNode: Successfully added node: 127.0.0.1:9002 (role index), response: {"otpNode":"n_2@127.0.0.1"}
2023/02/16 19:42:02 TestIndexNodeRebalanceIn: 2. Adding index node 127.0.0.1:9003 to the cluster
2023/02/16 19:42:02 Adding node: https://127.0.0.1:19003 with role: index to the cluster
2023/02/16 19:42:10 AddNode: Successfully added node: 127.0.0.1:9003 (role index), response: {"otpNode":"n_3@127.0.0.1"}
2023/02/16 19:42:10 TestIndexNodeRebalanceIn: 3. Rebalancing
2023/02/16 19:42:16 Rebalance progress: 6.25
2023/02/16 19:42:21 Rebalance progress: 12.5
2023/02/16 19:42:26 Rebalance progress: 40
2023/02/16 19:42:31 Rebalance progress: 42.50000000000001
2023/02/16 19:42:36 Rebalance progress: 47.5
2023/02/16 19:42:41 Rebalance progress: 47.5
2023/02/16 19:42:46 Rebalance progress: 52.5
2023/02/16 19:42:51 Rebalance progress: 57.49999999999999
2023/02/16 19:42:56 Rebalance progress: 62.5
2023/02/16 19:43:01 Rebalance progress: 67.5
2023/02/16 19:43:06 Rebalance progress: 67.5
2023/02/16 19:43:11 Rebalance progress: 72.50000000000001
2023/02/16 19:43:16 Rebalance progress: 77.5
2023/02/16 19:43:21 Rebalance progress: 100
2023/02/16 19:43:21 TestIndexNodeRebalanceIn exit: Current cluster configuration: map[127.0.0.1:9001:[index] 127.0.0.1:9002:[index] 127.0.0.1:9003:[index] 172.31.5.112:9000:[kv n1ql]]
--- PASS: TestIndexNodeRebalanceIn (90.08s)
=== RUN   TestCreateReplicatedIndexesBeforeRebalance
2023/02/16 19:43:23 set14_rebalance_test.go::TestCreateReplicatedIndexesBeforeRebalance: entry: Current cluster configuration: map[127.0.0.1:9001:[index] 127.0.0.1:9002:[index] 127.0.0.1:9003:[index] 172.31.5.112:9000:[kv n1ql]]
2023/02/16 19:43:23 set14_rebalance_test.go::TestCreateReplicatedIndexesBeforeRebalance: 1. Creating 5 indexes: non-partitioned, 2-REPLICA, non-deferred
2023/02/16 19:43:23 set14_rebalance_test.go::TestCreateReplicatedIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_2REPLICAS__id_isActive on `default`(_id, isActive) with {"num_replica":2}
2023/02/16 19:43:31 set14_rebalance_test.go::TestCreateReplicatedIndexesBeforeRebalance: set14_idx_2REPLICAS__id_isActive index is now active.
2023/02/16 19:43:31 set14_rebalance_test.go::TestCreateReplicatedIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_2REPLICAS_docid_balance on `default`(docid, balance) with {"num_replica":2}
2023/02/16 19:43:41 set14_rebalance_test.go::TestCreateReplicatedIndexesBeforeRebalance: set14_idx_2REPLICAS_docid_balance index is now active.
2023/02/16 19:43:41 set14_rebalance_test.go::TestCreateReplicatedIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_2REPLICAS_guid_picture on `default`(guid, picture) with {"num_replica":2}
2023/02/16 19:43:51 set14_rebalance_test.go::TestCreateReplicatedIndexesBeforeRebalance: set14_idx_2REPLICAS_guid_picture index is now active.
2023/02/16 19:43:51 set14_rebalance_test.go::TestCreateReplicatedIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_2REPLICAS_isActive_age on `default`(isActive, age) with {"num_replica":2}
2023/02/16 19:44:05 set14_rebalance_test.go::TestCreateReplicatedIndexesBeforeRebalance: set14_idx_2REPLICAS_isActive_age index is now active.
2023/02/16 19:44:05 set14_rebalance_test.go::TestCreateReplicatedIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_2REPLICAS_balance_eyeColor on `default`(balance, eyeColor) with {"num_replica":2}
2023/02/16 19:44:17 set14_rebalance_test.go::TestCreateReplicatedIndexesBeforeRebalance: set14_idx_2REPLICAS_balance_eyeColor index is now active.
2023/02/16 19:44:17 set14_rebalance_test.go::TestCreateReplicatedIndexesBeforeRebalance: 2. Creating 2 indexes: 5-PARTITION, 1-REPLICA, non-deferred
2023/02/16 19:44:17 set14_rebalance_test.go::TestCreateReplicatedIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_5PARTITIONS_1REPLICAS__id_balance on `default`(_id, balance) partition by hash(Meta().id) with {"num_partition":5, "num_replica":1}
2023/02/16 19:44:33 set14_rebalance_test.go::TestCreateReplicatedIndexesBeforeRebalance: set14_idx_5PARTITIONS_1REPLICAS__id_balance index is now active.
2023/02/16 19:44:33 set14_rebalance_test.go::TestCreateReplicatedIndexesBeforeRebalance:: Executing N1QL: create index set14_idx_5PARTITIONS_1REPLICAS_docid_picture on `default`(docid, picture) partition by hash(Meta().id) with {"num_partition":5, "num_replica":1}
2023/02/16 19:44:44 set14_rebalance_test.go::TestCreateReplicatedIndexesBeforeRebalance: set14_idx_5PARTITIONS_1REPLICAS_docid_picture index is now active.
2023/02/16 19:44:44 set14_rebalance_test.go::TestCreateReplicatedIndexesBeforeRebalance: exit: Current cluster configuration: map[127.0.0.1:9001:[index] 127.0.0.1:9002:[index] 127.0.0.1:9003:[index] 172.31.5.112:9000:[kv n1ql]]
--- PASS: TestCreateReplicatedIndexesBeforeRebalance (80.73s)
=== RUN   TestIndexNodeRebalanceOut
2023/02/16 19:44:44 set14_rebalance_test.go::TestIndexNodeRebalanceOut: entry: Current cluster configuration: map[127.0.0.1:9001:[index] 127.0.0.1:9002:[index] 127.0.0.1:9003:[index] 172.31.5.112:9000:[kv n1ql]]
2023/02/16 19:44:44 set14_rebalance_test.go::TestIndexNodeRebalanceOut: 1. Rebalancing index node 127.0.0.1:9001 out of the cluster
2023/02/16 19:44:44 Removing node(s): [127.0.0.1:9001] from the cluster
2023/02/16 19:44:50 Rebalance progress: 6.25
2023/02/16 19:44:54 Rebalance progress: 12.5
2023/02/16 19:45:00 Rebalance progress: 38.97058823529412
2023/02/16 19:45:04 Rebalance progress: 52.20588235294118
2023/02/16 19:45:09 Rebalance progress: 61.02941176470588
2023/02/16 19:45:14 Rebalance progress: 69.85294117647058
2023/02/16 19:45:19 Rebalance progress: 69.85294117647058
2023/02/16 19:45:24 Rebalance progress: 83.08823529411765
2023/02/16 19:45:29 Rebalance progress: 100
2023/02/16 19:45:29 set14_rebalance_test.go::TestIndexNodeRebalanceOut: exit: Current cluster configuration: map[127.0.0.1:9002:[index] 127.0.0.1:9003:[index] 172.31.5.112:9000:[kv n1ql]]
--- PASS: TestIndexNodeRebalanceOut (47.46s)
=== RUN   TestFailoverAndRebalance
2023/02/16 19:45:31 set14_rebalance_test.go::TestFailoverAndRebalance: entry: Current cluster configuration: map[127.0.0.1:9002:[index] 127.0.0.1:9003:[index] 172.31.5.112:9000:[kv n1ql]]
2023/02/16 19:45:31 set14_rebalance_test.go::TestFailoverAndRebalance: 1. Failing over index node 127.0.0.1:9002
2023/02/16 19:45:31 Failing over: [127.0.0.1:9002]
2023/02/16 19:45:33 set14_rebalance_test.go::TestFailoverAndRebalance: 2. Rebalancing
2023/02/16 19:45:39 Rebalance progress: 25
2023/02/16 19:45:44 Rebalance progress: 30
2023/02/16 19:45:48 Rebalance progress: 100
2023/02/16 19:45:49 set14_rebalance_test.go::TestFailoverAndRebalance: exit: Current cluster configuration: map[127.0.0.1:9003:[index] 172.31.5.112:9000:[kv n1ql]]
--- PASS: TestFailoverAndRebalance (19.23s)
=== RUN   TestSwapRebalance
2023/02/16 19:45:51 set14_rebalance_test.go::TestSwapRebalance: entry: Current cluster configuration: map[127.0.0.1:9003:[index] 172.31.5.112:9000:[kv n1ql]]
2023/02/16 19:45:51 set14_rebalance_test.go::TestSwapRebalance: 1. Adding index node 127.0.0.1:9001 to the cluster
2023/02/16 19:45:51 Adding node: https://127.0.0.1:19001 with role: index to the cluster
2023/02/16 19:45:58 AddNode: Successfully added node: 127.0.0.1:9001 (role index), response: {"otpNode":"n_1@127.0.0.1"}
2023/02/16 19:45:58 set14_rebalance_test.go::TestSwapRebalance: 2. Swap rebalancing index node 127.0.0.1:9003 out of the cluster
2023/02/16 19:45:58 Removing node(s): [127.0.0.1:9003] from the cluster
2023/02/16 19:46:04 Rebalance progress: 16.66666666666667
2023/02/16 19:46:09 Rebalance progress: 38.09523809523809
2023/02/16 19:46:14 Rebalance progress: 45.23809523809524
2023/02/16 19:46:19 Rebalance progress: 45.23809523809524
2023/02/16 19:46:24 Rebalance progress: 52.38095238095238
2023/02/16 19:46:29 Rebalance progress: 59.52380952380953
2023/02/16 19:46:34 Rebalance progress: 66.66666666666667
2023/02/16 19:46:39 Rebalance progress: 73.80952380952381
2023/02/16 19:46:43 Rebalance progress: 73.80952380952381
2023/02/16 19:46:48 Rebalance progress: 83.33333333333333
2023/02/16 19:46:53 Rebalance progress: 100
2023/02/16 19:46:54 set14_rebalance_test.go::TestSwapRebalance: exit: Current cluster configuration: map[127.0.0.1:9001:[index] 172.31.5.112:9000:[kv n1ql]]
--- PASS: TestSwapRebalance (64.90s)
=== RUN   TestRebalanceReplicaRepair
2023/02/16 19:46:56 TestRebalanceReplicaRepair entry: Current cluster configuration: map[127.0.0.1:9001:[index] 172.31.5.112:9000:[kv n1ql]]
2023/02/16 19:46:56 TestRebalanceReplicaRepair: 1. Adding index node 127.0.0.1:9002 to the cluster
2023/02/16 19:46:56 Adding node: https://127.0.0.1:19002 with role: index to the cluster
2023/02/16 19:47:06 AddNode: Successfully added node: 127.0.0.1:9002 (role index), response: {"otpNode":"n_2@127.0.0.1"}
2023/02/16 19:47:06 TestRebalanceReplicaRepair: 2. Adding index node 127.0.0.1:9003 to the cluster
2023/02/16 19:47:06 Adding node: https://127.0.0.1:19003 with role: index to the cluster
2023/02/16 19:47:14 AddNode: Successfully added node: 127.0.0.1:9003 (role index), response: {"otpNode":"n_3@127.0.0.1"}
2023/02/16 19:47:14 TestRebalanceReplicaRepair: 3. Rebalancing
2023/02/16 19:47:20 Rebalance progress: 12.5
2023/02/16 19:47:25 Rebalance progress: 12.5
2023/02/16 19:47:30 Rebalance progress: 25.35714285714285
2023/02/16 19:47:35 Rebalance progress: 31.78571428571429
2023/02/16 19:47:40 Rebalance progress: 31.78571428571429
2023/02/16 19:47:45 Rebalance progress: 31.78571428571429
2023/02/16 19:47:50 Rebalance progress: 36.07142857142857
2023/02/16 19:47:55 Rebalance progress: 44.64285714285714
2023/02/16 19:48:00 Rebalance progress: 44.64285714285714
2023/02/16 19:48:05 Rebalance progress: 48.92857142857142
2023/02/16 19:48:10 Rebalance progress: 53.21428571428572
2023/02/16 19:48:15 Rebalance progress: 55.35714285714286
2023/02/16 19:48:20 Rebalance progress: 57.49999999999999
2023/02/16 19:48:25 Rebalance progress: 61.78571428571428
2023/02/16 19:48:30 Rebalance progress: 66.07142857142857
2023/02/16 19:48:35 Rebalance progress: 70.35714285714285
2023/02/16 19:48:40 Rebalance progress: 74.64285714285714
2023/02/16 19:48:45 Rebalance progress: 74.64285714285714
2023/02/16 19:48:50 Rebalance progress: 78.92857142857143
2023/02/16 19:48:55 Rebalance progress: 87.5
2023/02/16 19:49:00 Rebalance progress: 100
2023/02/16 19:49:00 TestRebalanceReplicaRepair exit: Current cluster configuration: map[127.0.0.1:9001:[index] 127.0.0.1:9002:[index] 127.0.0.1:9003:[index] 172.31.5.112:9000:[kv n1ql]]
--- PASS: TestRebalanceReplicaRepair (126.10s)
=== RUN   TestFailureAndRebalanceDuringInitialIndexBuild
2023/02/16 19:49:02 set14_rebalance_test.go::TestFailureAndRebalanceDuringInitialIndexBuild: entry: Current cluster configuration: map[127.0.0.1:9001:[index] 127.0.0.1:9002:[index] 127.0.0.1:9003:[index] 172.31.5.112:9000:[kv n1ql]]
2023/02/16 19:49:07 Created the secondary index index_0. Waiting for it to become active
2023/02/16 19:49:07 Index is 7324187763956089422 now active
2023/02/16 19:49:14 Created the secondary index index_1. Waiting for it to become active
2023/02/16 19:49:14 Index is 11424209368782207345 now active
2023/02/16 19:49:24 Created the secondary index index_2. Waiting for it to become active
2023/02/16 19:49:24 Index is 5794289881840017325 now active
2023/02/16 19:49:31 Created the secondary index index_3. Waiting for it to become active
2023/02/16 19:49:31 Index is 2591488318318466884 now active
2023/02/16 19:49:37 Created the secondary index index_4. Waiting for it to become active
2023/02/16 19:49:37 Index is 11555918579901073676 now active
2023/02/16 19:49:43 Created the secondary index index_5. Waiting for it to become active
2023/02/16 19:49:43 Index is 12343886278208295070 now active
2023/02/16 19:49:50 Created the secondary index index_6. Waiting for it to become active
2023/02/16 19:49:50 Index is 1409773838723572789 now active
2023/02/16 19:49:56 Created the secondary index index_7. Waiting for it to become active
2023/02/16 19:49:56 Index is 1673539919945201893 now active
2023/02/16 19:50:02 Created the secondary index index_8. Waiting for it to become active
2023/02/16 19:50:02 Index is 11920958699239080452 now active
2023/02/16 19:50:11 Created the secondary index index_9. Waiting for it to become active
2023/02/16 19:50:11 Index is 5401300983854857427 now active
2023/02/16 19:50:49 Failing over: [127.0.0.1:9002]
2023/02/16 19:50:51 Build the deferred index index_11. Waiting for the index to become active
2023/02/16 19:50:51 Waiting for index 8559491612868964881 to go active ...
2023/02/16 19:50:52 Waiting for index 8559491612868964881 to go active ...
2023/02/16 19:50:52 TestFailureAndRebalanceDuringInitialIndexBuild: 1. Adding index node 127.0.0.1:9002 to the cluster
2023/02/16 19:50:52 Kicking off failover recovery, type: full
2023/02/16 19:50:53 Waiting for index 8559491612868964881 to go active ...
2023/02/16 19:50:54 Waiting for index 8559491612868964881 to go active ...
2023/02/16 19:50:55 Waiting for index 8559491612868964881 to go active ...
2023/02/16 19:50:56 Waiting for index 8559491612868964881 to go active ...
2023/02/16 19:50:57 Waiting for index 8559491612868964881 to go active ...
2023/02/16 19:50:58 Waiting for index 8559491612868964881 to go active ...
2023/02/16 19:50:59 Waiting for index 8559491612868964881 to go active ...
2023/02/16 19:50:59 Rebalance progress: 12.5
2023-02-16T19:51:01.739+05:30 [Fatal] indexer node d7a71b162079e58fd15efbd013bcccea not available
2023/02/16 19:51:01 Waiting for index 8559491612868964881 to go active ...
2023/02/16 19:51:02 Waiting for index 8559491612868964881 to go active ...
2023/02/16 19:51:03 Waiting for index 8559491612868964881 to go active ...
2023/02/16 19:51:04 Waiting for index 8559491612868964881 to go active ...
2023/02/16 19:51:04 Rebalance progress: 12.5
2023/02/16 19:51:05 Waiting for index 8559491612868964881 to go active ...
2023/02/16 19:51:06 Waiting for index 8559491612868964881 to go active ...
2023/02/16 19:51:07 Waiting for index 8559491612868964881 to go active ...
2023/02/16 19:51:08 Waiting for index 8559491612868964881 to go active ...
2023/02/16 19:51:09 Waiting for index 8559491612868964881 to go active ...
2023/02/16 19:51:10 Rebalance progress: 20
2023/02/16 19:51:10 Waiting for index 8559491612868964881 to go active ...
2023/02/16 19:51:11 Waiting for index 8559491612868964881 to go active ...
2023/02/16 19:51:12 Waiting for index 8559491612868964881 to go active ...
2023/02/16 19:51:13 Waiting for index 8559491612868964881 to go active ...
2023/02/16 19:51:14 Waiting for index 8559491612868964881 to go active ...
2023/02/16 19:51:14 Rebalance failed. See logs for detailed reason. You can try again.
2023/02/16 19:51:14 set14_rebalance_test.go::TestFailureAndRebalanceDuringInitialIndexBuild: exit: Current cluster configuration: map[127.0.0.1:9001:[index] 127.0.0.1:9002:[index] 127.0.0.1:9003:[index] 172.31.5.112:9000:[kv n1ql]]
--- PASS: TestFailureAndRebalanceDuringInitialIndexBuild (132.80s)
=== RUN   TestRedistributeWhenNodeIsAddedForFalse
2023/02/16 19:51:15 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse::Setup entry: Current cluster configuration: map[127.0.0.1:9001:[index] 127.0.0.1:9002:[index] 127.0.0.1:9003:[index] 172.31.5.112:9000:[kv n1ql]]
2023/02/16 19:51:15 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse::Setup 1. Setting up initial cluster configuration
2023/02/16 19:51:15 Removing node(s): [127.0.0.1:9001 127.0.0.1:9002 127.0.0.1:9003] from the cluster
2023/02/16 19:51:15 Waiting for index 8559491612868964881 to go active ...
2023/02/16 19:51:16 Waiting for index 8559491612868964881 to go active ...
2023/02/16 19:51:17 Index is 8559491612868964881 now active
2023/02/16 19:51:20 Rebalance progress: 6.25
2023/02/16 19:51:25 Rebalance progress: 87.5
2023/02/16 19:51:30 Rebalance progress: 100
2023/02/16 19:51:30 Adding node: https://127.0.0.1:19001 with role: index to the cluster
2023/02/16 19:51:49 AddNode: Successfully added node: 127.0.0.1:9001 (role index), response: {"otpNode":"n_1@127.0.0.1"}
2023/02/16 19:51:54 Rebalance progress: 100
2023/02/16 19:51:54 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse::Setup 2. Changing indexer.settings.rebalance.redistribute_indexes to false
2023/02/16 19:51:54 Changing config key indexer.settings.rebalance.redistribute_indexes to value false
2023/02/16 19:51:54 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse::Setup 3. Setup is completed
2023/02/16 19:51:54 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse::Setup exit: Current cluster configuration: map[127.0.0.1:9001:[index] 172.31.5.112:9000:[kv n1ql]]
2023/02/16 19:51:54 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse 1. Creating 10 indexes: non-partitioned, 0-REPLICA, non-deferred
2023/02/16 19:51:54 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse: Executing N1QL: create index set14_idx_0REPLICAS__id_isActive on `default`(_id, isActive)  with { "nodes":["127.0.0.1:9001"]}
2023/02/16 19:51:58 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse set14_idx_0REPLICAS__id_isActive index is now active.
2023/02/16 19:51:58 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse: Executing N1QL: create index set14_idx_0REPLICAS_docid_balance on `default`(docid, balance)  with { "nodes":["127.0.0.1:9001"]}
2023/02/16 19:52:07 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse set14_idx_0REPLICAS_docid_balance index is now active.
2023/02/16 19:52:07 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse: Executing N1QL: create index set14_idx_0REPLICAS_guid_picture on `default`(guid, picture)  with { "nodes":["127.0.0.1:9001"]}
2023/02/16 19:52:15 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse set14_idx_0REPLICAS_guid_picture index is now active.
2023/02/16 19:52:15 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse: Executing N1QL: create index set14_idx_0REPLICAS_isActive_age on `default`(isActive, age)  with { "nodes":["127.0.0.1:9001"]}
2023/02/16 19:52:22 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse set14_idx_0REPLICAS_isActive_age index is now active.
2023/02/16 19:52:22 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse: Executing N1QL: create index set14_idx_0REPLICAS_balance_eyeColor on `default`(balance, eyeColor)  with { "nodes":["127.0.0.1:9001"]}
2023/02/16 19:52:30 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse set14_idx_0REPLICAS_balance_eyeColor index is now active.
2023/02/16 19:52:30 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse: Executing N1QL: create index set14_idx_0REPLICAS_picture_name on `default`(picture, name)  with { "nodes":["127.0.0.1:9001"]}
2023/02/16 19:52:37 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse set14_idx_0REPLICAS_picture_name index is now active.
2023/02/16 19:52:37 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse: Executing N1QL: create index set14_idx_0REPLICAS_age_gender on `default`(age, gender)  with { "nodes":["127.0.0.1:9001"]}
2023/02/16 19:52:45 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse set14_idx_0REPLICAS_age_gender index is now active.
2023/02/16 19:52:45 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse: Executing N1QL: create index set14_idx_0REPLICAS_eyeColor_company on `default`(eyeColor, company)  with { "nodes":["127.0.0.1:9001"]}
2023/02/16 19:52:53 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse set14_idx_0REPLICAS_eyeColor_company index is now active.
2023/02/16 19:52:53 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse: Executing N1QL: create index set14_idx_0REPLICAS_name_email on `default`(name, email)  with { "nodes":["127.0.0.1:9001"]}
2023/02/16 19:53:00 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse set14_idx_0REPLICAS_name_email index is now active.
2023/02/16 19:53:00 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse: Executing N1QL: create index set14_idx_0REPLICAS_gender_phone on `default`(gender, phone)  with { "nodes":["127.0.0.1:9001"]}
2023/02/16 19:53:08 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse set14_idx_0REPLICAS_gender_phone index is now active.
2023/02/16 19:53:08 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse 2. Creating 4 indexes: 5-PARTITION, 0-REPLICA, non-deferred
2023/02/16 19:53:08 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse: Executing N1QL: create index set14_idx_5PARTITIONS_0REPLICAS__id_balance on `default`(_id, balance) partition by hash(Meta().id) with {"num_partition":5, "nodes":["127.0.0.1:9001"]}
2023/02/16 19:53:16 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse set14_idx_5PARTITIONS_0REPLICAS__id_balance index is now active.
2023/02/16 19:53:16 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse: Executing N1QL: create index set14_idx_5PARTITIONS_0REPLICAS_docid_picture on `default`(docid, picture) partition by hash(Meta().id) with {"num_partition":5, "nodes":["127.0.0.1:9001"]}
2023/02/16 19:53:24 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse set14_idx_5PARTITIONS_0REPLICAS_docid_picture index is now active.
2023/02/16 19:53:24 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse: Executing N1QL: create index set14_idx_5PARTITIONS_0REPLICAS_guid_age on `default`(guid, age) partition by hash(Meta().id) with {"num_partition":5, "nodes":["127.0.0.1:9001"]}
2023/02/16 19:53:32 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse set14_idx_5PARTITIONS_0REPLICAS_guid_age index is now active.
2023/02/16 19:53:32 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse: Executing N1QL: create index set14_idx_5PARTITIONS_0REPLICAS_isActive_eyeColor on `default`(isActive, eyeColor) partition by hash(Meta().id) with {"num_partition":5, "nodes":["127.0.0.1:9001"]}
2023/02/16 19:53:40 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse set14_idx_5PARTITIONS_0REPLICAS_isActive_eyeColor index is now active.
2023/02/16 19:53:40 for GetIndexLocalMetadata http://127.0.0.1:9108/getLocalIndexMetadata
2023/02/16 19:53:40 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForFalse: Adding node 127.0.0.1:9002 with service index to the cluster
2023/02/16 19:53:40 Adding node: https://127.0.0.1:19002 with role: index to the cluster
2023/02/16 19:53:49 AddNode: Successfully added node: 127.0.0.1:9002 (role index), response: {"otpNode":"n_2@127.0.0.1"}
2023/02/16 19:53:55 Rebalance progress: 16.66666666666667
2023/02/16 19:54:00 Rebalance progress: 100
2023/02/16 19:54:00 for GetIndexLocalMetadata http://127.0.0.1:9108/getLocalIndexMetadata
2023/02/16 19:54:00 for GetIndexLocalMetadata http://127.0.0.1:9114/getLocalIndexMetadata
--- PASS: TestRedistributeWhenNodeIsAddedForFalse (165.55s)
=== RUN   TestRedistributeWhenNodeIsAddedForTrue
2023/02/16 19:54:00 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue::Setup entry: Current cluster configuration: map[127.0.0.1:9001:[index] 127.0.0.1:9002:[index] 172.31.5.112:9000:[kv n1ql]]
2023/02/16 19:54:00 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue::Setup 1. Setting up initial cluster configuration
2023/02/16 19:54:00 Removing node(s): [127.0.0.1:9001 127.0.0.1:9002 127.0.0.1:9003] from the cluster
2023/02/16 19:54:05 Rebalance progress: 83.33333333333333
2023/02/16 19:54:10 Rebalance progress: 100
2023/02/16 19:54:10 Adding node: https://127.0.0.1:19001 with role: index to the cluster
2023/02/16 19:54:22 AddNode: Successfully added node: 127.0.0.1:9001 (role index), response: {"otpNode":"n_1@127.0.0.1"}
2023/02/16 19:54:27 Rebalance progress: 100
2023/02/16 19:54:27 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue::Setup 2. Changing indexer.settings.rebalance.redistribute_indexes to true
2023/02/16 19:54:28 Changing config key indexer.settings.rebalance.redistribute_indexes to value true
2023/02/16 19:54:28 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue::Setup 3. Setup is completed
2023/02/16 19:54:28 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue::Setup exit: Current cluster configuration: map[127.0.0.1:9001:[index] 172.31.5.112:9000:[kv n1ql]]
2023/02/16 19:54:28 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue 1. Creating 10 indexes: non-partitioned, 0-REPLICA, non-deferred
2023/02/16 19:54:28 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue: Executing N1QL: create index set14_idx_0REPLICAS__id_isActive on `default`(_id, isActive)  with { "nodes":["127.0.0.1:9001"]}
2023/02/16 19:54:32 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue set14_idx_0REPLICAS__id_isActive index is now active.
2023/02/16 19:54:32 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue: Executing N1QL: create index set14_idx_0REPLICAS_docid_balance on `default`(docid, balance)  with { "nodes":["127.0.0.1:9001"]}
2023/02/16 19:54:40 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue set14_idx_0REPLICAS_docid_balance index is now active.
2023/02/16 19:54:40 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue: Executing N1QL: create index set14_idx_0REPLICAS_guid_picture on `default`(guid, picture)  with { "nodes":["127.0.0.1:9001"]}
2023/02/16 19:54:48 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue set14_idx_0REPLICAS_guid_picture index is now active.
2023/02/16 19:54:48 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue: Executing N1QL: create index set14_idx_0REPLICAS_isActive_age on `default`(isActive, age)  with { "nodes":["127.0.0.1:9001"]}
2023/02/16 19:54:55 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue set14_idx_0REPLICAS_isActive_age index is now active.
2023/02/16 19:54:55 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue: Executing N1QL: create index set14_idx_0REPLICAS_balance_eyeColor on `default`(balance, eyeColor)  with { "nodes":["127.0.0.1:9001"]}
2023/02/16 19:55:03 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue set14_idx_0REPLICAS_balance_eyeColor index is now active.
2023/02/16 19:55:03 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue: Executing N1QL: create index set14_idx_0REPLICAS_picture_name on `default`(picture, name)  with { "nodes":["127.0.0.1:9001"]}
2023/02/16 19:55:10 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue set14_idx_0REPLICAS_picture_name index is now active.
2023/02/16 19:55:10 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue: Executing N1QL: create index set14_idx_0REPLICAS_age_gender on `default`(age, gender)  with { "nodes":["127.0.0.1:9001"]}
2023/02/16 19:55:19 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue set14_idx_0REPLICAS_age_gender index is now active.
2023/02/16 19:55:19 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue: Executing N1QL: create index set14_idx_0REPLICAS_eyeColor_company on `default`(eyeColor, company)  with { "nodes":["127.0.0.1:9001"]}
2023/02/16 19:55:27 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue set14_idx_0REPLICAS_eyeColor_company index is now active.
2023/02/16 19:55:27 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue: Executing N1QL: create index set14_idx_0REPLICAS_name_email on `default`(name, email)  with { "nodes":["127.0.0.1:9001"]}
2023/02/16 19:55:35 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue set14_idx_0REPLICAS_name_email index is now active.
2023/02/16 19:55:35 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue: Executing N1QL: create index set14_idx_0REPLICAS_gender_phone on `default`(gender, phone)  with { "nodes":["127.0.0.1:9001"]}
2023/02/16 19:55:43 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue set14_idx_0REPLICAS_gender_phone index is now active.
2023/02/16 19:55:43 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue 2. Creating 4 indexes: 5-PARTITION, 0-REPLICA, non-deferred
2023/02/16 19:55:43 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue: Executing N1QL: create index set14_idx_5PARTITIONS_0REPLICAS__id_balance on `default`(_id, balance) partition by hash(Meta().id) with {"num_partition":5, "nodes":["127.0.0.1:9001"]}
2023/02/16 19:55:51 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue set14_idx_5PARTITIONS_0REPLICAS__id_balance index is now active.
2023/02/16 19:55:51 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue: Executing N1QL: create index set14_idx_5PARTITIONS_0REPLICAS_docid_picture on `default`(docid, picture) partition by hash(Meta().id) with {"num_partition":5, "nodes":["127.0.0.1:9001"]}
2023/02/16 19:55:59 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue set14_idx_5PARTITIONS_0REPLICAS_docid_picture index is now active.
2023/02/16 19:55:59 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue: Executing N1QL: create index set14_idx_5PARTITIONS_0REPLICAS_guid_age on `default`(guid, age) partition by hash(Meta().id) with {"num_partition":5, "nodes":["127.0.0.1:9001"]}
2023/02/16 19:56:07 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue set14_idx_5PARTITIONS_0REPLICAS_guid_age index is now active.
2023/02/16 19:56:07 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue: Executing N1QL: create index set14_idx_5PARTITIONS_0REPLICAS_isActive_eyeColor on `default`(isActive, eyeColor) partition by hash(Meta().id) with {"num_partition":5, "nodes":["127.0.0.1:9001"]}
2023/02/16 19:56:15 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue set14_idx_5PARTITIONS_0REPLICAS_isActive_eyeColor index is now active.
2023/02/16 19:56:15 for GetIndexLocalMetadata http://127.0.0.1:9108/getLocalIndexMetadata
2023/02/16 19:56:15 set14_rebalance_test.go::TestRedistributeWhenNodeIsAddedForTrue: Adding node 127.0.0.1:9002 with service index to the cluster
2023/02/16 19:56:15 Adding node: https://127.0.0.1:19002 with role: index to the cluster
2023/02/16 19:56:25 AddNode: Successfully added node: 127.0.0.1:9002 (role index), response: {"otpNode":"n_2@127.0.0.1"}
2023/02/16 19:56:30 Rebalance progress: 16.66666666666667
2023/02/16 19:56:35 Rebalance progress: 23.33333333333333
2023/02/16 19:56:40 Rebalance progress: 41.66666666666666
2023/02/16 19:56:45 Rebalance progress: 41.66666666666666
2023/02/16 19:56:50 Rebalance progress: 66.66666666666667
2023/02/16 19:56:55 Rebalance progress: 66.66666666666667
2023/02/16 19:57:00 Rebalance progress: 100
2023/02/16 19:57:00 for GetIndexLocalMetadata http://127.0.0.1:9108/getLocalIndexMetadata
2023/02/16 19:57:00 for GetIndexLocalMetadata http://127.0.0.1:9114/getLocalIndexMetadata
--- PASS: TestRedistributeWhenNodeIsAddedForTrue (179.97s)
=== RUN   TestRebalanceResetCluster
2023/02/16 19:57:00 set14_rebalance_test.go::TestRebalanceResetCluster: entry: Current cluster configuration: map[127.0.0.1:9001:[index] 127.0.0.1:9002:[index] 172.31.5.112:9000:[kv n1ql]]
2023/02/16 19:57:00 set14_rebalance_test.go::TestRebalanceResetCluster: 1. Restoring indexer.settings.rebalance.redistribute_indexes to false
2023/02/16 19:57:00 Changing config key indexer.settings.rebalance.redistribute_indexes to value false
2023/02/16 19:57:00 set14_rebalance_test.go::TestRebalanceResetCluster: 2. Resetting cluster to initial configuration
2023/02/16 19:57:00 Removing node(s): [127.0.0.1:9001 127.0.0.1:9002 127.0.0.1:9003] from the cluster
2023/02/16 19:57:06 Rebalance progress: 83.33333333333333
2023/02/16 19:57:10 Rebalance progress: 100
2023/02/16 19:57:10 Adding node: https://127.0.0.1:19001 with role: index to the cluster
2023/02/16 19:57:24 AddNode: Successfully added node: 127.0.0.1:9001 (role index), response: {"otpNode":"n_1@127.0.0.1"}
2023/02/16 19:57:29 Rebalance progress: 100
2023/02/16 19:57:29 set14_rebalance_test.go::TestRebalanceResetCluster: exit: Current cluster configuration: map[127.0.0.1:9001:[index] 172.31.5.112:9000:[kv n1ql]]
--- PASS: TestRebalanceResetCluster (31.14s)
=== RUN   TestAlterIndexIncrReplica
2023/02/16 19:57:31 In TestAlterIndexIncrReplica()
2023/02/16 19:57:31 This test creates an index with one replica and then increments replica count to 2
2023/02/16 19:57:31 Removing node(s): [127.0.0.1:9001 127.0.0.1:9002 127.0.0.1:9003] from the cluster
2023/02/16 19:57:38 Rebalance progress: 100
2023/02/16 19:57:38 Adding node: https://127.0.0.1:19001 with role: index to the cluster
2023/02/16 19:57:50 AddNode: Successfully added node: 127.0.0.1:9001 (role index), response: {"otpNode":"n_1@127.0.0.1"}
2023/02/16 19:57:55 Rebalance progress: 100
2023/02/16 19:57:55 Adding node: https://127.0.0.1:19002 with role: index to the cluster
2023/02/16 19:58:03 AddNode: Successfully added node: 127.0.0.1:9002 (role index), response: {"otpNode":"n_2@127.0.0.1"}
2023/02/16 19:58:08 Rebalance progress: 100
2023/02/16 19:58:08 Adding node: https://127.0.0.1:19003 with role: index to the cluster
2023/02/16 19:58:17 AddNode: Successfully added node: 127.0.0.1:9003 (role index), response: {"otpNode":"n_3@127.0.0.1"}
2023/02/16 19:58:22 Rebalance progress: 87.5
2023/02/16 19:58:27 Rebalance progress: 100
2023/02/16 19:58:27 In DropAllSecondaryIndexes()
2023/02/16 19:59:12 Flushed the bucket default, Response body: 
2023/02/16 19:59:18 Created the secondary index idx_1. Waiting for it to become active
2023/02/16 19:59:18 Index is 3759082810745622823 now active
2023/02/16 19:59:18 Executing alter index command: alter index `default`.idx_1 with {"action":"replica_count", "num_replica":2}
2023/02/16 19:59:46 Using n1ql client
(last message repeated 99 times through 19:59:47)
--- PASS: TestAlterIndexIncrReplica (136.16s)
=== RUN   TestAlterIndexDecrReplica
2023/02/16 19:59:47 In TestAlterIndexDecrReplica()
2023/02/16 19:59:47 This test creates an index with two replicas and then decrements replica count to 1
2023/02/16 19:59:47 In DropAllSecondaryIndexes()
2023/02/16 19:59:47 Index found:  idx_1
2023/02/16 19:59:48 Dropped index idx_1
2023/02/16 20:00:08 Created the secondary index idx_2. Waiting for it to become active
2023/02/16 20:00:08 Index is 15586698494549435956 now active
2023/02/16 20:00:08 Executing alter index command: alter index `default`.idx_2 with {"action":"replica_count", "num_replica":1}
2023/02/16 20:00:34 Using n1ql client
(last message repeated 99 times)
--- PASS: TestAlterIndexDecrReplica (46.76s)
=== RUN   TestAlterIndexDropReplica
2023/02/16 20:00:34 In TestAlterIndexDropReplica()
2023/02/16 20:00:34 This test creates an index with two replicas and then drops one replica from cluster
2023/02/16 20:00:34 In DropAllSecondaryIndexes()
2023/02/16 20:00:34 Index found:  idx_2
2023/02/16 20:00:34 Dropped index idx_2
2023/02/16 20:00:51 Created the secondary index idx_3. Waiting for it to become active
2023/02/16 20:00:51 Index is 7479617527174730785 now active
2023/02/16 20:00:51 Executing alter index command: alter index `default`.idx_3 with {"action":"drop_replica", "replicaId":0}
2023/02/16 20:01:17 Using n1ql client
(last message repeated 99 times)
--- PASS: TestAlterIndexDropReplica (43.26s)
=== RUN   TestResetCluster_1
2023/02/16 20:01:17 Removing node(s): [127.0.0.1:9001 127.0.0.1:9002 127.0.0.1:9003] from the cluster
2023/02/16 20:01:23 Rebalance progress: 87.5
2023/02/16 20:01:28 Rebalance progress: 100
2023/02/16 20:01:28 Adding node: https://127.0.0.1:19001 with role: index to the cluster
2023/02/16 20:01:46 AddNode: Successfully added node: 127.0.0.1:9001 (role index), response: {"otpNode":"n_1@127.0.0.1"}
2023/02/16 20:01:51 Rebalance progress: 100
--- PASS: TestResetCluster_1 (34.20s)
=== RUN   TestPartitionDistributionWithReplica
2023/02/16 20:01:51 In TestPartitionDistributionWithReplica()
2023/02/16 20:01:51 This test will create a partitioned index with a replica and check the partition distribution
2023/02/16 20:01:51 Partitions with the same ID belonging to both the replica and the source index should not be on the same node
2023/02/16 20:01:52 Removing node(s): [127.0.0.1:9001 127.0.0.1:9002 127.0.0.1:9003] from the cluster
2023-02-16T20:01:55.274+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023-02-16T20:01:55.304+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023/02/16 20:01:57 Rebalance progress: 100
2023/02/16 20:01:57 Adding node: https://127.0.0.1:19001 with role: index to the cluster
2023/02/16 20:02:10 AddNode: Successfully added node: 127.0.0.1:9001 (role index), response: {"otpNode":"n_1@127.0.0.1"}
2023/02/16 20:02:16 Rebalance progress: 100
2023/02/16 20:02:16 Adding node: https://127.0.0.1:19002 with role: index to the cluster
2023/02/16 20:02:24 AddNode: Successfully added node: 127.0.0.1:9002 (role index), response: {"otpNode":"n_2@127.0.0.1"}
2023/02/16 20:02:30 Rebalance progress: 100
2023/02/16 20:02:30 Adding node: https://127.0.0.1:19003 with role: index to the cluster
2023/02/16 20:02:37 AddNode: Successfully added node: 127.0.0.1:9003 (role index), response: {"otpNode":"n_3@127.0.0.1"}
2023/02/16 20:02:42 Rebalance progress: 12.5
2023/02/16 20:02:47 Rebalance progress: 100
2023/02/16 20:02:47 In DropAllSecondaryIndexes()
2023/02/16 20:03:33 Flushed the bucket default, Response body: 
2023/02/16 20:03:33 Executing create partition index command on: create index `idx_partn` on `default`(age) partition by hash(meta().id) with {"num_partition":8, "num_replica":1}
2023/02/16 20:04:01 Using n1ql client
(last message repeated 99 times)
--- PASS: TestPartitionDistributionWithReplica (130.16s)
=== RUN   TestPartitionedPartialIndex
2023/02/16 20:04:02 Executing create index command: CREATE INDEX `idx_regular` ON `default`(partn_name)
2023/02/16 20:04:06 Using n1ql client
2023/02/16 20:04:06 Dropping the secondary index idx_regular
2023/02/16 20:04:07 Index dropped
2023/02/16 20:04:07 Executing create index command: CREATE INDEX `idx_partial` ON `default`(partn_name) WHERE partn_age >= 0
2023/02/16 20:04:13 Using n1ql client
2023/02/16 20:04:14 Using n1ql client
2023/02/16 20:04:14 Using n1ql client
2023/02/16 20:04:14 Dropping the secondary index idx_partial
2023/02/16 20:04:14 Index dropped
2023/02/16 20:04:15 Executing create index command: CREATE INDEX `idx_partitioned` ON `default`(partn_name) PARTITION BY HASH(meta().id) 
2023/02/16 20:04:28 Using n1ql client
2023/02/16 20:04:28 Using n1ql client
2023/02/16 20:04:28 Dropping the secondary index idx_partitioned
2023/02/16 20:04:28 Index dropped
2023/02/16 20:04:28 Executing create index command: CREATE INDEX `idx_partitioned_partial` ON `default`(partn_name) PARTITION BY HASH(meta().id) WHERE partn_age >= 0
2023/02/16 20:04:41 Using n1ql client
2023/02/16 20:04:42 Using n1ql client
2023/02/16 20:04:42 Using n1ql client
2023/02/16 20:04:42 Using n1ql client
2023/02/16 20:04:42 Dropping the secondary index idx_partitioned_partial
2023/02/16 20:04:43 Index dropped
--- PASS: TestPartitionedPartialIndex (53.41s)
=== RUN   TestResetCluster_2
2023/02/16 20:04:55 Removing node(s): [127.0.0.1:9001 127.0.0.1:9002 127.0.0.1:9003] from the cluster
2023/02/16 20:05:01 Rebalance progress: 87.5
2023/02/16 20:05:05 Rebalance progress: 100
2023/02/16 20:05:05 Adding node: https://127.0.0.1:19001 with role: index to the cluster
2023/02/16 20:05:24 AddNode: Successfully added node: 127.0.0.1:9001 (role index), response: {"otpNode":"n_1@127.0.0.1"}
2023/02/16 20:05:29 Rebalance progress: 100
--- PASS: TestResetCluster_2 (34.22s)
=== RUN   TestCollectionSetup
2023/02/16 20:05:29 In TestCollectionSetup()
2023/02/16 20:05:29 In DropAllSecondaryIndexes()
2023/02/16 20:06:24 Flushed the bucket default, Response body: 
--- PASS: TestCollectionSetup (56.56s)
=== RUN   TestCollectionDefault
2023/02/16 20:06:26 In TestCollectionDefault()
2023/02/16 20:06:37 Created the secondary index _default__default_i1. Waiting for it to become active
2023/02/16 20:06:37 Index is 7544684882186738050 now active
2023/02/16 20:06:37 Using n1ql client
2023/02/16 20:06:38 Expected and Actual scan responses are the same
2023/02/16 20:06:44 Created the secondary index _default__default_i2. Waiting for it to become active
2023/02/16 20:06:44 Index is 8731585954416731352 now active
2023/02/16 20:06:44 Using n1ql client
2023/02/16 20:06:45 Expected and Actual scan responses are the same
2023/02/16 20:06:45 Dropping the secondary index _default__default_i1
2023/02/16 20:06:45 Index dropped
2023/02/16 20:06:45 Using n1ql client
2023/02/16 20:06:45 Expected and Actual scan responses are the same
2023/02/16 20:06:50 Created the secondary index _default__default_i1. Waiting for it to become active
2023/02/16 20:06:50 Index is 10861411433065542996 now active
2023/02/16 20:06:50 Using n1ql client
2023/02/16 20:06:50 Expected and Actual scan responses are the same
2023/02/16 20:06:50 Dropping the secondary index _default__default_i1
2023/02/16 20:06:50 Index dropped
2023/02/16 20:06:50 Dropping the secondary index _default__default_i2
2023/02/16 20:06:50 Index dropped
2023/02/16 20:06:55 Created the secondary index _default__default_i1. Waiting for it to become active
2023/02/16 20:06:55 Index is 10412078327335482489 now active
2023/02/16 20:06:55 Using n1ql client
2023/02/16 20:06:56 Expected and Actual scan responses are the same
2023/02/16 20:07:01 Created the secondary index _default__default_i2. Waiting for it to become active
2023/02/16 20:07:01 Index is 15480751878664132430 now active
2023/02/16 20:07:01 Using n1ql client
2023/02/16 20:07:01 Expected and Actual scan responses are the same
2023/02/16 20:07:01 Dropping the secondary index _default__default_i1
2023/02/16 20:07:02 Index dropped
2023/02/16 20:07:02 Dropping the secondary index _default__default_i2
2023/02/16 20:07:02 Index dropped
2023/02/16 20:07:02 Build command issued for the deferred indexes [_default__default_i1 _default__default_i2], bucket: default, scope: _default, coll: _default
2023/02/16 20:07:02 Waiting for the index _default__default_i1 to become active
2023/02/16 20:07:02 Waiting for index 11685850064285505035 to go active ...
2023/02/16 20:07:03 Waiting for index 11685850064285505035 to go active ...
2023/02/16 20:07:04 Waiting for index 11685850064285505035 to go active ...
2023/02/16 20:07:05 Waiting for index 11685850064285505035 to go active ...
2023/02/16 20:07:06 Waiting for index 11685850064285505035 to go active ...
2023/02/16 20:07:07 Index is 11685850064285505035 now active
2023/02/16 20:07:07 Waiting for the index _default__default_i2 to become active
2023/02/16 20:07:07 Index is 9432837333749852263 now active
2023/02/16 20:07:14 Using n1ql client
2023/02/16 20:07:14 Expected and Actual scan responses are the same
2023/02/16 20:07:14 Using n1ql client
2023/02/16 20:07:14 Expected and Actual scan responses are the same
2023/02/16 20:07:14 Dropping the secondary index _default__default_i1
2023/02/16 20:07:14 Index dropped
2023/02/16 20:07:20 Using n1ql client
2023/02/16 20:07:20 Expected and Actual scan responses are the same
2023/02/16 20:07:20 Dropping the secondary index _default__default_i2
2023/02/16 20:07:20 Index dropped
--- PASS: TestCollectionDefault (54.38s)
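The deferred-build sequence above retries once per second until each index reports active. A minimal sketch of that wait loop, assuming a hypothetical `get_state` callback (the real suite reads index state from the indexer's metadata endpoint, e.g. `getLocalIndexMetadata` seen earlier in this log):

```python
import time

def wait_for_index_active(index_id, get_state, interval=1.0, timeout=120.0):
    """Retry until the index reports "active".

    get_state is a hypothetical callback returning the index state
    string ("created", "building", "active", ...).
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if get_state(index_id) == "active":
            print(f"Index is {index_id} now active")
            return True
        print(f"Waiting for index {index_id} to go active ...")
        time.sleep(interval)
    # Timed out without the index becoming active
    return False
```

Each "Waiting for index ... to go active ..." line in the log corresponds to one iteration of a loop like this.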
=== RUN   TestCollectionNonDefault
2023/02/16 20:07:20 In TestCollectionNonDefault()
2023/02/16 20:07:20 Creating scope: s1 for bucket: default as it does not exist
2023/02/16 20:07:20 Create scope succeeded for bucket default, scopeName: s1, body: {"uid":"1"} 
2023/02/16 20:07:21 Create collection succeeded for bucket: default, scope: s1, collection: c1, body: {"uid":"2"}
2023/02/16 20:07:36 Created the secondary index s1_c1_i1. Waiting for it to become active
2023/02/16 20:07:36 Index is 14654433815072220577 now active
2023/02/16 20:07:36 Using n1ql client
2023-02-16T20:07:37.000+05:30 [Info] metadata provider version changed 3510 -> 3511
2023-02-16T20:07:37.000+05:30 [Info] switched currmeta from 3510 -> 3511 force false 
2023-02-16T20:07:37.000+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T20:07:37.001+05:30 [Info] GSIC[default/default-s1-c1-1676558256988660808] started ...
2023/02/16 20:07:37 Expected and Actual scan responses are the same
2023/02/16 20:07:44 Created the secondary index s1_c1_i2. Waiting for it to become active
2023/02/16 20:07:44 Index is 7910860959580795711 now active
2023/02/16 20:07:44 Using n1ql client
2023/02/16 20:07:44 Expected and Actual scan responses are the same
2023/02/16 20:07:44 Dropping the secondary index s1_c1_i1
2023/02/16 20:07:44 Index dropped
2023/02/16 20:07:44 Using n1ql client
2023/02/16 20:07:44 Expected and Actual scan responses are the same
2023/02/16 20:07:50 Created the secondary index s1_c1_i1. Waiting for it to become active
2023/02/16 20:07:50 Index is 7275477979876140115 now active
2023/02/16 20:07:50 Using n1ql client
2023/02/16 20:07:51 Expected and Actual scan responses are the same
2023/02/16 20:07:51 Dropping the secondary index s1_c1_i1
2023/02/16 20:07:51 Index dropped
2023/02/16 20:07:51 Dropping the secondary index s1_c1_i2
2023/02/16 20:07:52 Index dropped
2023/02/16 20:07:56 Created the secondary index s1_c1_i1. Waiting for it to become active
2023/02/16 20:07:56 Index is 7341632007436121196 now active
2023/02/16 20:07:56 Using n1ql client
2023/02/16 20:07:56 Expected and Actual scan responses are the same
2023/02/16 20:08:04 Created the secondary index s1_c1_i2. Waiting for it to become active
2023/02/16 20:08:04 Index is 2268316514057279722 now active
2023/02/16 20:08:04 Using n1ql client
2023/02/16 20:08:04 Expected and Actual scan responses are the same
2023/02/16 20:08:04 Dropping the secondary index s1_c1_i1
2023/02/16 20:08:04 Index dropped
2023/02/16 20:08:04 Dropping the secondary index s1_c1_i2
2023/02/16 20:08:04 Index dropped
2023/02/16 20:08:04 Build command issued for the deferred indexes [s1_c1_i1 s1_c1_i2], bucket: default, scope: s1, coll: c1
2023/02/16 20:08:04 Waiting for the index s1_c1_i1 to become active
2023/02/16 20:08:04 Waiting for index 14314799180607868997 to go active ...
2023/02/16 20:08:05 Waiting for index 14314799180607868997 to go active ...
2023/02/16 20:08:06 Waiting for index 14314799180607868997 to go active ...
2023/02/16 20:08:07 Waiting for index 14314799180607868997 to go active ...
2023/02/16 20:08:08 Waiting for index 14314799180607868997 to go active ...
2023/02/16 20:08:09 Waiting for index 14314799180607868997 to go active ...
2023/02/16 20:08:10 Index is 14314799180607868997 now active
2023/02/16 20:08:10 Waiting for the index s1_c1_i2 to become active
2023/02/16 20:08:10 Index is 3534164742854890559 now active
2023/02/16 20:08:11 Using n1ql client
2023/02/16 20:08:14 Expected and Actual scan responses are the same
2023/02/16 20:08:14 Using n1ql client
2023/02/16 20:08:14 Expected and Actual scan responses are the same
2023/02/16 20:08:14 Dropping the secondary index s1_c1_i1
2023/02/16 20:08:14 Index dropped
2023/02/16 20:08:15 Using n1ql client
2023/02/16 20:08:15 Expected and Actual scan responses are the same
2023/02/16 20:08:15 Dropping the secondary index s1_c1_i2
2023/02/16 20:08:16 Index dropped
--- PASS: TestCollectionNonDefault (55.31s)
=== RUN   TestCollectionMetaAtSnapEnd
2023/02/16 20:08:16 In TestCollectionMetaAtSnapEnd()
2023/02/16 20:08:16 Creating scope: s2 for bucket: default as it does not exist
2023/02/16 20:08:16 Create scope succeeded for bucket default, scopeName: s2, body: {"uid":"3"} 
2023/02/16 20:08:16 Create collection succeeded for bucket: default, scope: s2, collection: c2, body: {"uid":"4"}
2023/02/16 20:08:30 Created the secondary index s2_c2_i1. Waiting for it to become active
2023/02/16 20:08:30 Index is 8460404073290374495 now active
2023/02/16 20:08:30 Using n1ql client
2023-02-16T20:08:30.466+05:30 [Info] metadata provider version changed 3569 -> 3570
2023-02-16T20:08:30.467+05:30 [Info] switched currmeta from 3569 -> 3570 force false 
2023-02-16T20:08:30.467+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T20:08:30.467+05:30 [Info] GSIC[default/default-s2-c2-1676558310456687144] started ...
2023/02/16 20:08:30 Expected and Actual scan responses are the same
2023/02/16 20:08:35 Using n1ql client
2023/02/16 20:08:37 Expected and Actual scan responses are the same
2023/02/16 20:08:37 Create collection succeeded for bucket: default, scope: s2, collection: c3, body: {"uid":"5"}
2023/02/16 20:08:47 Using n1ql client
2023/02/16 20:08:47 Expected and Actual scan responses are the same
2023/02/16 20:08:52 Created the secondary index s2_c2_i2. Waiting for it to become active
2023/02/16 20:08:52 Index is 5479440355332240242 now active
2023/02/16 20:08:52 Using n1ql client
2023/02/16 20:08:52 Expected and Actual scan responses are the same
2023/02/16 20:08:52 Using n1ql client
2023/02/16 20:08:52 Expected and Actual scan responses are the same
2023/02/16 20:08:58 Using n1ql client
2023/02/16 20:08:58 Expected and Actual scan responses are the same
2023/02/16 20:08:58 Using n1ql client
2023/02/16 20:08:58 Expected and Actual scan responses are the same
--- PASS: TestCollectionMetaAtSnapEnd (42.60s)
=== RUN   TestCollectionUpdateSeq
2023/02/16 20:08:58 In TestCollectionUpdateSeq()
2023/02/16 20:08:59 Using n1ql client
2023/02/16 20:08:59 Expected and Actual scan responses are the same
2023/02/16 20:08:59 Using n1ql client
2023/02/16 20:08:59 Expected and Actual scan responses are the same
2023/02/16 20:09:06 Using n1ql client
2023/02/16 20:09:06 Expected and Actual scan responses are the same
2023/02/16 20:09:06 Using n1ql client
2023/02/16 20:09:06 Expected and Actual scan responses are the same
2023/02/16 20:09:06 Dropping the secondary index s2_c2_i1
2023/02/16 20:09:06 Index dropped
2023/02/16 20:09:06 Dropping the secondary index s2_c2_i2
2023/02/16 20:09:06 Index dropped
--- PASS: TestCollectionUpdateSeq (7.61s)
=== RUN   TestCollectionMultiple
2023/02/16 20:09:06 In TestCollectionMultiple()
2023/02/16 20:09:10 Created the secondary index _default__default_i3. Waiting for it become active
2023/02/16 20:09:10 Index is 6119611010788492830 now active
2023/02/16 20:09:10 Using n1ql client
2023/02/16 20:09:10 Expected and Actual scan responses are the same
2023/02/16 20:09:17 Created the secondary index s1_c1_i4. Waiting for it become active
2023/02/16 20:09:17 Index is 11321241173413650613 now active
2023/02/16 20:09:17 Using n1ql client
2023/02/16 20:09:17 Expected and Actual scan responses are the same
2023/02/16 20:09:17 Dropping the secondary index _default__default_i3
2023/02/16 20:09:17 Index dropped
2023/02/16 20:09:17 Dropping the secondary index s1_c1_i4
2023/02/16 20:09:18 Index dropped
--- PASS: TestCollectionMultiple (11.79s)
=== RUN   TestCollectionNoDocs
2023/02/16 20:09:18 In TestCollectionNoDocs()
2023/02/16 20:09:23 Created the secondary index s1_c1_i1. Waiting for it become active
2023/02/16 20:09:23 Index is 344328217685085387 now active
2023/02/16 20:09:23 Using n1ql client
2023/02/16 20:09:23 Expected and Actual scan responses are the same
2023/02/16 20:09:31 Created the secondary index s1_c1_i2. Waiting for it become active
2023/02/16 20:09:31 Index is 8238281779764057219 now active
2023/02/16 20:09:31 Using n1ql client
2023/02/16 20:09:31 Expected and Actual scan responses are the same
2023/02/16 20:09:31 Dropping the secondary index s1_c1_i1
2023/02/16 20:09:31 Index dropped
2023/02/16 20:09:31 Using n1ql client
2023/02/16 20:09:31 Expected and Actual scan responses are the same
2023/02/16 20:09:38 Created the secondary index s1_c1_i1. Waiting for it become active
2023/02/16 20:09:38 Index is 14750153267170146474 now active
2023/02/16 20:09:38 Using n1ql client
2023/02/16 20:09:39 Expected and Actual scan responses are the same
2023/02/16 20:09:45 Using n1ql client
2023/02/16 20:09:45 Expected and Actual scan responses are the same
2023/02/16 20:09:45 Using n1ql client
2023/02/16 20:09:45 Expected and Actual scan responses are the same
2023/02/16 20:09:45 Dropping the secondary index s1_c1_i1
2023/02/16 20:09:45 Index dropped
2023/02/16 20:09:45 Dropping the secondary index s1_c1_i2
2023/02/16 20:09:45 Index dropped
--- PASS: TestCollectionNoDocs (27.35s)
=== RUN   TestCollectionPrimaryIndex
2023/02/16 20:09:45 In TestCollectionPrimaryIndex()
2023/02/16 20:09:51 Created the secondary index s1_c1_i1. Waiting for it become active
2023/02/16 20:09:51 Index is 11018486365075205704 now active
2023/02/16 20:09:51 Using n1ql client
2023/02/16 20:09:58 Created the secondary index s1_c1_i2. Waiting for it become active
2023/02/16 20:09:58 Index is 7468200485857327296 now active
2023/02/16 20:09:58 Using n1ql client
2023/02/16 20:09:59 Using n1ql client
2023/02/16 20:09:59 Using n1ql client
2023/02/16 20:09:59 Dropping the secondary index s1_c1_i1
2023/02/16 20:09:59 Index dropped
2023/02/16 20:10:06 Created the secondary index s1_c1_i1. Waiting for it become active
2023/02/16 20:10:06 Index is 16058753977036598908 now active
2023/02/16 20:10:06 Using n1ql client
2023/02/16 20:10:06 Expected and Actual scan responses are the same
2023/02/16 20:10:07 Using n1ql client
2023/02/16 20:10:07 Expected and Actual scan responses are the same
2023/02/16 20:10:07 Using n1ql client
2023/02/16 20:10:07 Dropping the secondary index s1_c1_i1
2023/02/16 20:10:07 Index dropped
2023/02/16 20:10:07 Dropping the secondary index s1_c1_i2
2023/02/16 20:10:07 Index dropped
--- PASS: TestCollectionPrimaryIndex (22.16s)
=== RUN   TestCollectionMultipleBuilds
2023/02/16 20:10:07 Build command issued for the deferred indexes [1487112004431281321 1796326251702371115]
2023/02/16 20:10:08 Build command issued for the deferred indexes [5495717888734183912 16518804027352058921]
2023/02/16 20:10:08 Waiting for index 1487112004431281321 to go active ...
2023/02/16 20:10:09 Waiting for index 1487112004431281321 to go active ...
2023/02/16 20:10:10 Waiting for index 1487112004431281321 to go active ...
2023/02/16 20:10:11 Waiting for index 1487112004431281321 to go active ...
2023/02/16 20:10:12 Waiting for index 1487112004431281321 to go active ...
2023/02/16 20:10:13 Waiting for index 1487112004431281321 to go active ...
2023/02/16 20:10:14 Waiting for index 1487112004431281321 to go active ...
2023/02/16 20:10:15 Waiting for index 1487112004431281321 to go active ...
2023/02/16 20:10:16 Waiting for index 1487112004431281321 to go active ...
2023/02/16 20:10:17 Waiting for index 1487112004431281321 to go active ...
2023/02/16 20:10:18 Waiting for index 1487112004431281321 to go active ...
2023/02/16 20:10:19 Waiting for index 1487112004431281321 to go active ...
2023/02/16 20:10:20 Waiting for index 1487112004431281321 to go active ...
2023/02/16 20:10:21 Waiting for index 1487112004431281321 to go active ...
2023/02/16 20:10:22 Index is 1487112004431281321 now active
2023/02/16 20:10:22 Index is 1796326251702371115 now active
2023/02/16 20:10:22 Index is 5495717888734183912 now active
2023/02/16 20:10:22 Index is 16518804027352058921 now active
2023/02/16 20:10:22 Using n1ql client
2023/02/16 20:10:22 Expected and Actual scan responses are the same
2023/02/16 20:10:22 Using n1ql client
2023/02/16 20:10:22 Expected and Actual scan responses are the same
2023/02/16 20:10:22 Using n1ql client
2023/02/16 20:10:22 Expected and Actual scan responses are the same
2023/02/16 20:10:22 Using n1ql client
2023/02/16 20:10:22 Expected and Actual scan responses are the same
2023/02/16 20:10:22 Dropping the secondary index s2_c2_i3
2023/02/16 20:10:22 Index dropped
2023/02/16 20:10:22 Dropping the secondary index s2_c2_i4
2023/02/16 20:10:22 Index dropped
2023/02/16 20:10:22 Dropping the secondary index s1_c1_i1
2023/02/16 20:10:22 Index dropped
2023/02/16 20:10:22 Dropping the secondary index s1_c1_i2
2023/02/16 20:10:22 Index dropped
--- PASS: TestCollectionMultipleBuilds (14.98s)
=== RUN   TestCollectionMultipleBuilds2
2023/02/16 20:10:23 Build command issued for the deferred indexes [1442480582826968537 7528789546596933861 11898123391954316275 15365131268852253262 16032617763564978270 2226410905122912095 6213285519229839343 5108069137248060561]
2023/02/16 20:10:23 Waiting for index 1442480582826968537 to go active ...
2023/02/16 20:10:24 Waiting for index 1442480582826968537 to go active ...
2023/02/16 20:10:25 Waiting for index 1442480582826968537 to go active ...
2023/02/16 20:10:26 Waiting for index 1442480582826968537 to go active ...
2023/02/16 20:10:27 Waiting for index 1442480582826968537 to go active ...
2023/02/16 20:10:28 Index is 1442480582826968537 now active
2023/02/16 20:10:28 Index is 7528789546596933861 now active
2023/02/16 20:10:28 Waiting for index 11898123391954316275 to go active ...
2023/02/16 20:10:29 Waiting for index 11898123391954316275 to go active ...
2023/02/16 20:10:30 Waiting for index 11898123391954316275 to go active ...
2023/02/16 20:10:31 Waiting for index 11898123391954316275 to go active ...
2023/02/16 20:10:32 Waiting for index 11898123391954316275 to go active ...
2023/02/16 20:10:33 Waiting for index 11898123391954316275 to go active ...
2023/02/16 20:10:34 Waiting for index 11898123391954316275 to go active ...
2023/02/16 20:10:35 Waiting for index 11898123391954316275 to go active ...
2023/02/16 20:10:36 Waiting for index 11898123391954316275 to go active ...
2023/02/16 20:10:37 Waiting for index 11898123391954316275 to go active ...
2023/02/16 20:10:38 Waiting for index 11898123391954316275 to go active ...
2023/02/16 20:10:39 Waiting for index 11898123391954316275 to go active ...
2023/02/16 20:10:40 Waiting for index 11898123391954316275 to go active ...
2023/02/16 20:10:41 Waiting for index 11898123391954316275 to go active ...
2023/02/16 20:10:42 Waiting for index 11898123391954316275 to go active ...
2023/02/16 20:10:43 Waiting for index 11898123391954316275 to go active ...
2023/02/16 20:10:44 Waiting for index 11898123391954316275 to go active ...
2023/02/16 20:10:45 Waiting for index 11898123391954316275 to go active ...
2023/02/16 20:10:46 Waiting for index 11898123391954316275 to go active ...
2023/02/16 20:10:47 Waiting for index 11898123391954316275 to go active ...
2023/02/16 20:10:48 Waiting for index 11898123391954316275 to go active ...
2023/02/16 20:10:49 Index is 11898123391954316275 now active
2023/02/16 20:10:49 Waiting for index 15365131268852253262 to go active ...
2023/02/16 20:10:50 Index is 15365131268852253262 now active
2023/02/16 20:10:50 Index is 16032617763564978270 now active
2023/02/16 20:10:50 Index is 2226410905122912095 now active
2023/02/16 20:10:50 Index is 6213285519229839343 now active
2023/02/16 20:10:50 Index is 5108069137248060561 now active
2023/02/16 20:10:50 Using n1ql client
2023/02/16 20:10:50 Expected and Actual scan responses are the same
2023/02/16 20:10:50 Using n1ql client
2023/02/16 20:10:50 Expected and Actual scan responses are the same
2023/02/16 20:10:50 Using n1ql client
2023/02/16 20:10:50 Expected and Actual scan responses are the same
2023/02/16 20:10:50 Using n1ql client
2023/02/16 20:10:50 Expected and Actual scan responses are the same
2023/02/16 20:10:50 Using n1ql client
2023-02-16T20:10:50.870+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T20:10:50.870+05:30 [Info] GSIC[default/default-s2-c3-1676558450867768346] started ...
2023/02/16 20:10:50 Expected and Actual scan responses are the same
2023/02/16 20:10:50 Using n1ql client
2023/02/16 20:10:50 Expected and Actual scan responses are the same
2023/02/16 20:10:50 Using n1ql client
2023/02/16 20:10:50 Expected and Actual scan responses are the same
2023/02/16 20:10:50 Using n1ql client
2023/02/16 20:10:50 Expected and Actual scan responses are the same
2023/02/16 20:10:52 Using n1ql client
2023/02/16 20:10:52 Expected and Actual scan responses are the same
2023/02/16 20:10:52 Using n1ql client
2023/02/16 20:10:52 Expected and Actual scan responses are the same
2023/02/16 20:10:52 Dropping the secondary index s1_c1_i1
2023/02/16 20:10:52 Index dropped
2023/02/16 20:10:52 Dropping the secondary index s1_c1_i2
2023/02/16 20:10:52 Index dropped
2023/02/16 20:10:52 Dropping the secondary index s2_c2_i1
2023/02/16 20:10:52 Index dropped
2023/02/16 20:10:52 Dropping the secondary index s2_c2_i2
2023/02/16 20:10:52 Index dropped
2023/02/16 20:10:52 Dropping the secondary index s2_c3_i1
2023/02/16 20:10:52 Index dropped
2023/02/16 20:10:52 Dropping the secondary index s2_c3_i2
2023/02/16 20:10:53 Index dropped
2023/02/16 20:10:53 Dropping the secondary index _default__default_i1
2023/02/16 20:10:53 Index dropped
2023/02/16 20:10:53 Dropping the secondary index _default__default_i2
2023/02/16 20:10:53 Index dropped
--- PASS: TestCollectionMultipleBuilds2 (30.71s)
=== RUN   TestCollectionIndexDropConcurrentBuild
2023/02/16 20:10:53 In TestCollectionIndexDropConcurrentBuild()
2023/02/16 20:10:53 Build command issued for the deferred indexes [16732764474947828149 8024808602464944867]
2023/02/16 20:10:54 Dropping the secondary index s1_c1_i1
2023/02/16 20:10:54 Index dropped
2023/02/16 20:10:54 Waiting for index 8024808602464944867 to go active ...
2023/02/16 20:10:55 Waiting for index 8024808602464944867 to go active ...
2023/02/16 20:10:56 Waiting for index 8024808602464944867 to go active ...
2023/02/16 20:10:57 Waiting for index 8024808602464944867 to go active ...
2023/02/16 20:10:58 Index is 8024808602464944867 now active
2023/02/16 20:10:58 Using n1ql client
2023/02/16 20:10:58 Expected and Actual scan responses are the same
2023/02/16 20:10:58 Dropping the secondary index s1_c1_i2
2023/02/16 20:10:59 Index dropped
--- PASS: TestCollectionIndexDropConcurrentBuild (5.79s)
=== RUN   TestCollectionIndexDropConcurrentBuild2
2023/02/16 20:10:59 In TestCollectionIndexDropConcurrentBuild2()
2023/02/16 20:11:05 Created the secondary index s1_c1_i3. Waiting for it become active
2023/02/16 20:11:05 Index is 13409791916902601881 now active
2023/02/16 20:11:05 Using n1ql client
2023-02-16T20:11:05.498+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] Range(4688085527175897513) response failed `Index not found`
2023-02-16T20:11:05.502+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] Range(4688085527175897513) response failed `Index not found`
2023-02-16T20:11:05.514+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] Range(4688085527175897513) response failed `Index not found`
2023/02/16 20:11:05 Expected and Actual scan responses are the same
2023/02/16 20:11:05 Build command issued for the deferred indexes [9358019263512259111 10343115455612074435]
2023/02/16 20:11:06 Dropping the secondary index s1_c1_i3
2023/02/16 20:11:07 Index dropped
2023/02/16 20:11:07 Waiting for index 9358019263512259111 to go active ...
2023/02/16 20:11:08 Waiting for index 9358019263512259111 to go active ...
2023/02/16 20:11:09 Waiting for index 9358019263512259111 to go active ...
2023/02/16 20:11:10 Waiting for index 9358019263512259111 to go active ...
2023/02/16 20:11:11 Waiting for index 9358019263512259111 to go active ...
2023/02/16 20:11:12 Waiting for index 9358019263512259111 to go active ...
2023/02/16 20:11:13 Waiting for index 9358019263512259111 to go active ...
2023/02/16 20:11:14 Index is 9358019263512259111 now active
2023/02/16 20:11:14 Index is 10343115455612074435 now active
2023/02/16 20:11:14 Using n1ql client
2023/02/16 20:11:14 Expected and Actual scan responses are the same
2023/02/16 20:11:14 Using n1ql client
2023/02/16 20:11:14 Expected and Actual scan responses are the same
2023/02/16 20:11:14 Dropping the secondary index s1_c1_i1
2023/02/16 20:11:14 Index dropped
2023/02/16 20:11:14 Dropping the secondary index s1_c1_i2
2023/02/16 20:11:16 Index dropped
--- PASS: TestCollectionIndexDropConcurrentBuild2 (17.29s)
=== RUN   TestCollectionDrop
2023/02/16 20:11:16 In TestCollectionDrop()
2023/02/16 20:11:21 Created the secondary index s1_c1_i1. Waiting for it become active
2023/02/16 20:11:21 Index is 13768278659020924031 now active
2023/02/16 20:11:29 Created the secondary index s1_c1_i2. Waiting for it become active
2023/02/16 20:11:29 Index is 11342195526401419030 now active
2023/02/16 20:11:36 Created the secondary index s2_c2_i1. Waiting for it become active
2023/02/16 20:11:36 Index is 8532106094665278326 now active
2023/02/16 20:11:43 Created the secondary index s2_c2_i2. Waiting for it become active
2023/02/16 20:11:43 Index is 2004211206367387362 now active
2023/02/16 20:11:50 Created the secondary index s2_c3_i1. Waiting for it become active
2023/02/16 20:11:50 Index is 16127283678765771932 now active
2023/02/16 20:11:57 Created the secondary index s2_c3_i2. Waiting for it become active
2023/02/16 20:11:57 Index is 13507416726582154317 now active
2023/02/16 20:12:03 Created the secondary index _default__default_i1. Waiting for it become active
2023/02/16 20:12:03 Index is 17184084990040232463 now active
2023/02/16 20:12:10 Created the secondary index _default__default_i2. Waiting for it become active
2023/02/16 20:12:10 Index is 7282696138809867024 now active
2023/02/16 20:12:10 Dropped collection c1 for bucket: default, scope: s1, body: {"uid":"6"}
2023/02/16 20:12:15 Using n1ql client
2023/02/16 20:12:15 Scan failed as expected with error: Index Not Found - cause: GSI index s1_c1_i1 not found.
2023/02/16 20:12:15 Dropped scope s2 for bucket: default, body: {"uid":"7"}
2023/02/16 20:12:20 Using n1ql client
2023-02-16T20:12:20.812+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T20:12:20.812+05:30 [Info] GSIC[default/default-s2-c1-1676558540810245568] started ...
2023/02/16 20:12:20 Scan failed as expected with error: Index Not Found - cause: GSI index s2_c1_i1 not found.
--- PASS: TestCollectionDrop (64.48s)
=== RUN   TestCollectionDDLWithConcurrentSystemEvents
2023/02/16 20:12:20 Creating scope: sc for bucket: default as it does not exist
2023/02/16 20:12:20 Create scope succeeded for bucket default, scopeName: sc, body: {"uid":"8"} 
2023/02/16 20:12:21 Created collection succeeded for bucket: default, scope: sc, collection: cc, body: {"uid":"9"}
2023/02/16 20:12:36 Created the secondary index sc_cc_i1. Waiting for it become active
2023/02/16 20:12:36 Index is 1542787613060768901 now active
2023/02/16 20:12:38 Build command issued for the deferred indexes [sc_cc_i2], bucket: default, scope: sc, coll: cc
2023/02/16 20:12:38 Waiting for the index sc_cc_i2 to become active
2023/02/16 20:12:38 Waiting for index 15120199537398608892 to go active ...
2023/02/16 20:12:38 Created collection succeeded for bucket: default, scope: sc, collection: cc_0, body: {"uid":"a"}
2023/02/16 20:12:39 Created collection succeeded for bucket: default, scope: sc, collection: cc_1, body: {"uid":"b"}
2023/02/16 20:12:39 Created collection succeeded for bucket: default, scope: sc, collection: cc_2, body: {"uid":"c"}
2023/02/16 20:12:39 Waiting for index 15120199537398608892 to go active ...
2023/02/16 20:12:40 Created collection succeeded for bucket: default, scope: sc, collection: cc_3, body: {"uid":"d"}
2023/02/16 20:12:40 Created collection succeeded for bucket: default, scope: sc, collection: cc_4, body: {"uid":"e"}
2023/02/16 20:12:40 Waiting for index 15120199537398608892 to go active ...
2023/02/16 20:12:40 Created collection succeeded for bucket: default, scope: sc, collection: cc_5, body: {"uid":"f"}
2023/02/16 20:12:41 Created collection succeeded for bucket: default, scope: sc, collection: cc_6, body: {"uid":"10"}
2023/02/16 20:12:41 Waiting for index 15120199537398608892 to go active ...
2023/02/16 20:12:42 Created collection succeeded for bucket: default, scope: sc, collection: cc_7, body: {"uid":"11"}
2023/02/16 20:12:42 Waiting for index 15120199537398608892 to go active ...
2023/02/16 20:12:42 Created collection succeeded for bucket: default, scope: sc, collection: cc_8, body: {"uid":"12"}
2023/02/16 20:12:43 Waiting for index 15120199537398608892 to go active ...
2023/02/16 20:12:44 Created collection succeeded for bucket: default, scope: sc, collection: cc_9, body: {"uid":"13"}
2023/02/16 20:12:44 Waiting for index 15120199537398608892 to go active ...
2023/02/16 20:12:45 Waiting for index 15120199537398608892 to go active ...
2023/02/16 20:12:46 Index is 15120199537398608892 now active
2023/02/16 20:12:46 Using n1ql client
2023-02-16T20:12:46.678+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T20:12:46.678+05:30 [Info] GSIC[default/default-sc-cc-1676558566675712441] started ...
--- PASS: TestCollectionDDLWithConcurrentSystemEvents (25.88s)
=== RUN   TestCollectionDropWithMultipleBuckets
2023/02/16 20:12:46 In TestCollectionWithDropMultipleBuckets()
2023/02/16 20:12:46 This test will create a collection across multiple buckets and 
2023/02/16 20:12:46 drop a collection on one bucket. Indexer should not drop indexes
2023/02/16 20:12:46 with the same CollectionID but different buckets
2023/02/16 20:12:46 Creating test_bucket_1
2023/02/16 20:12:46 Created bucket test_bucket_1, responseBody: 
2023/02/16 20:12:56 Creating test_bucket_2
2023/02/16 20:12:56 Created bucket test_bucket_2, responseBody: 
2023-02-16T20:13:05.499+05:30 [Error] receiving packet: read tcp 127.0.0.1:56110->127.0.0.1:9107: i/o timeout
2023-02-16T20:13:05.499+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(4688085527175897513) connection "127.0.0.1:56110" response transport failed `read tcp 127.0.0.1:56110->127.0.0.1:9107: i/o timeout`
2023-02-16T20:13:05.504+05:30 [Error] receiving packet: read tcp 127.0.0.1:41610->127.0.0.1:9107: i/o timeout
2023-02-16T20:13:05.504+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(4688085527175897513) connection "127.0.0.1:41610" response transport failed `read tcp 127.0.0.1:41610->127.0.0.1:9107: i/o timeout`
2023-02-16T20:13:05.516+05:30 [Error] receiving packet: read tcp 127.0.0.1:41612->127.0.0.1:9107: i/o timeout
2023-02-16T20:13:05.516+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(4688085527175897513) connection "127.0.0.1:41612" response transport failed `read tcp 127.0.0.1:41612->127.0.0.1:9107: i/o timeout`
2023/02/16 20:13:06 Creating collection: test for bucket: test_bucket_1
2023/02/16 20:13:06 Created collection succeeded for bucket: test_bucket_1, scope: _default, collection: test, body: {"uid":"1"}
2023/02/16 20:13:06 Creating collection: test for bucket: test_bucket_2
2023/02/16 20:13:07 Created collection succeeded for bucket: test_bucket_2, scope: _default, collection: test, body: {"uid":"1"}
2023/02/16 20:13:17 Creating Index: idx_1 on scope: _default collection: test for bucket: test_bucket_1
2023/02/16 20:13:20 Created the secondary index idx_1. Waiting for it become active
2023/02/16 20:13:20 Index is 9213030585968435564 now active
2023/02/16 20:13:25 Creating Index: idx_1 on scope: _default collection: test for bucket: test_bucket_2
2023/02/16 20:13:28 Created the secondary index idx_1. Waiting for it become active
2023/02/16 20:13:28 Index is 13646116153969571640 now active
2023/02/16 20:13:33 Dropping collection: test for bucket: test_bucket_1
2023/02/16 20:13:34 Dropped collection test for bucket: test_bucket_1, scope: _default, body: {"uid":"2"}
2023/02/16 20:13:36 Scanning index: idx_1, bucket: test_bucket_2
2023/02/16 20:13:36 Using n1ql client
2023-02-16T20:13:36.156+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T20:13:36.157+05:30 [Info] GSIC[default/test_bucket_2-_default-test-1676558616152966546] started ...
2023/02/16 20:13:36 Deleting bucket: test_bucket_2
2023/02/16 20:13:38 Deleted bucket test_bucket_2, responseBody: 
2023/02/16 20:13:43 Creating test_bucket_2
2023/02/16 20:13:43 Created bucket test_bucket_2, responseBody: 
2023/02/16 20:13:53 Creating collection: test for bucket: test_bucket_2
2023/02/16 20:13:53 Created collection succeeded for bucket: test_bucket_2, scope: _default, collection: test, body: {"uid":"1"}
2023/02/16 20:14:03 Creating Index: idx_1 on scope: _default collection: test for bucket: test_bucket_2
2023/02/16 20:14:08 Build command issued for the deferred indexes [idx_1], bucket: test_bucket_2, scope: _default, coll: test
2023/02/16 20:14:08 Waiting for the index idx_1 to become active
2023/02/16 20:14:08 Waiting for index 6856887983279632803 to go active ...
2023/02/16 20:14:09 Waiting for index 6856887983279632803 to go active ...
2023/02/16 20:14:10 Waiting for index 6856887983279632803 to go active ...
2023/02/16 20:14:11 Index is 6856887983279632803 now active
2023/02/16 20:14:11 Scanning index: idx_1, bucket: test_bucket_2
2023/02/16 20:14:11 Using n1ql client
2023/02/16 20:14:13 Deleted bucket test_bucket_1, responseBody: 
2023/02/16 20:14:14 Deleted bucket test_bucket_2, responseBody: 
--- PASS: TestCollectionDropWithMultipleBuckets (93.30s)
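The invariant this test checks (both buckets get a "test" collection with the same uid "1", and dropping it in one bucket must not remove the other bucket's idx_1) comes down to keying index metadata by bucket as well as CollectionID. A toy model, purely illustrative and not the indexer's actual bookkeeping:

```go
package main

import "fmt"

// key identifies an index's keyspace. CollectionIDs are only unique
// within a bucket (both buckets above report uid "1"), so the bucket
// name must be part of the key: dropping by CollectionID alone would
// also remove indexes belonging to the surviving bucket.
type key struct {
	bucket       string
	collectionID string
}

// dropCollection removes all index entries for one bucket's collection.
func dropCollection(indexes map[key][]string, bucket, cid string) {
	delete(indexes, key{bucket, cid})
}

func main() {
	indexes := map[key][]string{
		{"test_bucket_1", "1"}: {"idx_1"},
		{"test_bucket_2", "1"}: {"idx_1"},
	}
	// Collection uid "1" is dropped on test_bucket_1 only.
	dropCollection(indexes, "test_bucket_1", "1")
	fmt.Println(indexes[key{"test_bucket_2", "1"}]) // idx_1 survives
}
```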
=== RUN   TestStringToByteSlice
--- PASS: TestStringToByteSlice (0.00s)
=== RUN   TestStringToByteSlice_Stack
--- PASS: TestStringToByteSlice_Stack (1.47s)
=== RUN   TestByteSliceToString
--- PASS: TestByteSliceToString (0.00s)
=== RUN   TestBytesToString_WithUnusedBytes
--- PASS: TestBytesToString_WithUnusedBytes (0.00s)
=== RUN   TestStringHeadersCompatible
--- PASS: TestStringHeadersCompatible (0.00s)
=== RUN   TestSliceHeadersCompatible
--- PASS: TestSliceHeadersCompatible (0.00s)
=== RUN   TestEphemeralBucketBasic
2023/02/16 20:14:21 In TestEphemeralBuckets()
2023/02/16 20:14:21 In DropAllSecondaryIndexes()
2023/02/16 20:14:21 Index found:  sc_cc_i1
2023/02/16 20:14:21 Dropped index sc_cc_i1
2023/02/16 20:14:21 Index found:  sc_cc_i2
2023/02/16 20:14:21 Dropped index sc_cc_i2
2023/02/16 20:14:21 Index found:  _default__default_i2
2023/02/16 20:14:21 Dropped index _default__default_i2
2023/02/16 20:14:21 Index found:  _default__default_i1
2023/02/16 20:14:21 Dropped index _default__default_i1
2023/02/16 20:14:57 Flushed the bucket default, Response body: 
2023/02/16 20:15:00 Modified parameters of bucket default, responseBody: 
2023/02/16 20:15:00 Created bucket ephemeral1, responseBody: 
2023/02/16 20:15:00 Created bucket ephemeral2, responseBody: 
2023/02/16 20:15:01 Created bucket ephemeral3, responseBody: 
2023/02/16 20:15:16 Generating docs and Populating all the buckets
2023/02/16 20:15:20 Created the secondary index bucket1_age. Waiting for it become active
2023/02/16 20:15:20 Index is 7870524835679461892 now active
2023/02/16 20:15:27 Created the secondary index bucket2_city. Waiting for it become active
2023/02/16 20:15:27 Index is 9342028484870619691 now active
2023/02/16 20:15:38 Created the secondary index bucket3_gender. Waiting for it become active
2023/02/16 20:15:38 Index is 15999034696166270947 now active
2023/02/16 20:15:46 Created the secondary index bucket4_balance. Waiting for it become active
2023/02/16 20:15:46 Index is 4594265976685634851 now active
2023/02/16 20:15:49 Using n1ql client
2023/02/16 20:15:49 Expected and Actual scan responses are the same
2023/02/16 20:15:49 Using n1ql client
2023-02-16T20:15:49.530+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T20:15:49.531+05:30 [Info] GSIC[default/ephemeral1-_default-_default-1676558749525906032] started ...
2023/02/16 20:15:49 Expected and Actual scan responses are the same
2023/02/16 20:15:49 Using n1ql client
2023-02-16T20:15:49.553+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T20:15:49.554+05:30 [Info] GSIC[default/ephemeral2-_default-_default-1676558749550001216] started ...
2023/02/16 20:15:49 Expected and Actual scan responses are the same
2023/02/16 20:15:49 Using n1ql client
2023-02-16T20:15:49.579+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T20:15:49.580+05:30 [Info] GSIC[default/ephemeral3-_default-_default-1676558749575672989] started ...
2023/02/16 20:15:49 Expected and Actual scan responses are the same
2023/02/16 20:15:50 Deleted bucket ephemeral1, responseBody: 
2023/02/16 20:15:52 Deleted bucket ephemeral2, responseBody: 
2023/02/16 20:15:54 Deleted bucket ephemeral3, responseBody: 
2023/02/16 20:15:57 Modified parameters of bucket default, responseBody: 
--- PASS: TestEphemeralBucketBasic (110.90s)
=== RUN   TestEphemeralBucketRecovery
2023/02/16 20:16:12 In TestEphemeralBucketRecovery()
2023/02/16 20:16:12 In DropAllSecondaryIndexes()
2023/02/16 20:16:12 Index found:  bucket1_age
2023/02/16 20:16:12 Dropped index bucket1_age
2023/02/16 20:16:48 Flushed the bucket default, Response body: 
2023/02/16 20:16:51 Modified parameters of bucket default, responseBody: 
2023/02/16 20:16:51 Created bucket ephemeral1, responseBody: 
2023/02/16 20:17:06 Generating docs and Populating all the buckets
2023/02/16 20:17:11 Created the secondary index bucket1_age. Waiting for it become active
2023/02/16 20:17:11 Index is 16383053681999914287 now active
2023/02/16 20:17:18 Created the secondary index bucket2_city. Waiting for it become active
2023/02/16 20:17:18 Index is 12293202247599315756 now active
Restarting indexer process ...
2023/02/16 20:17:21 []
2023-02-16T20:17:21.312+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023-02-16T20:17:21.312+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023/02/16 20:17:41 Using n1ql client
2023-02-16T20:17:41.257+05:30 [Error] transport error between 127.0.0.1:41614->127.0.0.1:9107: write tcp 127.0.0.1:41614->127.0.0.1:9107: write: broken pipe
2023-02-16T20:17:41.257+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] -8132305378621101246 request transport failed `write tcp 127.0.0.1:41614->127.0.0.1:9107: write: broken pipe`
2023-02-16T20:17:41.257+05:30 [Error] metadataClient:PickRandom: Fail to find indexer for all index partitions. Num partition 1.  Partition with instances 0 
2023-02-16T20:17:41.257+05:30 [Error] metadataClient:PickRandom: Replicas - [9757699369099627506], PrunedReplica - map[], FilteredReplica map[]
2023/02/16 20:17:41 Expected and Actual scan responses are the same
2023/02/16 20:17:41 Using n1ql client
2023-02-16T20:17:41.280+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T20:17:41.280+05:30 [Info] GSIC[default/ephemeral1-_default-_default-1676558861277383623] started ...
2023/02/16 20:17:41 Expected and Actual scan responses are the same
2023/02/16 20:17:43 Deleted bucket ephemeral1, responseBody: 
2023/02/16 20:17:46 Modified parameters of bucket default, responseBody: 
--- PASS: TestEphemeralBucketRecovery (108.89s)
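After the indexer restart above, the first scan hits stale connections ("write: broken pipe" transport errors) yet still ends with "Expected and Actual scan responses are the same", because the client re-issues the request. A hedged sketch of that retry behavior, not the GsiClient implementation:

```go
package main

import (
	"errors"
	"fmt"
)

// errTransport stands in for the "request transport failed ... broken
// pipe" errors logged after the indexer restart.
var errTransport = errors.New("request transport failed: broken pipe")

// scanWithRetry re-issues the scan up to attempts times, returning the
// first successful result; this is why the scans above still succeed
// despite the initial transport errors.
func scanWithRetry(scan func() ([]string, error), attempts int) ([]string, error) {
	var err error
	for i := 0; i < attempts; i++ {
		var rows []string
		if rows, err = scan(); err == nil {
			return rows, nil
		}
		fmt.Println("[Error]", err) // mirrors the logged transport errors
	}
	return nil, err
}

func main() {
	calls := 0
	scan := func() ([]string, error) {
		calls++
		if calls == 1 {
			// First attempt lands on the connection severed by the restart.
			return nil, errTransport
		}
		return []string{"doc1", "doc2"}, nil
	}
	rows, err := scanWithRetry(scan, 3)
	fmt.Println(len(rows), err == nil)
}
```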
=== RUN   TestEphemeralBucketFlush
2023/02/16 20:18:01 In TestEphemeralBucketFlush()
2023/02/16 20:18:01 In DropAllSecondaryIndexes()
2023/02/16 20:18:01 Index found:  bucket1_age
2023/02/16 20:18:01 Dropped index bucket1_age
2023/02/16 20:18:37 Flushed the bucket default, Response body: 
2023/02/16 20:18:40 Modified parameters of bucket default, responseBody: 
2023/02/16 20:18:40 Created bucket ephemeral1, responseBody: 
2023/02/16 20:18:55 Generating docs and Populating all the buckets
2023/02/16 20:18:59 Created the secondary index bucket1_age. Waiting for it become active
2023/02/16 20:18:59 Index is 7706199779869352918 now active
2023/02/16 20:19:06 Created the secondary index bucket2_city. Waiting for it become active
2023/02/16 20:19:06 Index is 1820040903924150687 now active
2023/02/16 20:19:45 Flushed the bucket default, Response body: 
2023/02/16 20:19:48 Flush Enabled on bucket ephemeral1, responseBody: 
2023/02/16 20:20:21 Flushed the bucket ephemeral1, Response body: 
2023/02/16 20:20:21 Using n1ql client
2023/02/16 20:20:21 Using n1ql client
2023-02-16T20:20:21.658+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T20:20:21.658+05:30 [Info] GSIC[default/ephemeral1-_default-_default-1676559021655707325] started ...
2023/02/16 20:20:22 Deleted bucket ephemeral1, responseBody: 
2023/02/16 20:20:25 Modified parameters of bucket default, responseBody: 
--- PASS: TestEphemeralBucketFlush (159.57s)
=== RUN   TestEphemeralBucketMCDCrash
2023/02/16 20:20:40 In TestEphemeralBucketMCDCrash()
2023/02/16 20:20:40 In DropAllSecondaryIndexes()
2023/02/16 20:20:40 Index found:  bucket1_age
2023/02/16 20:20:40 Dropped index bucket1_age
2023/02/16 20:21:16 Flushed the bucket default, Response body: 
2023/02/16 20:21:19 Modified parameters of bucket default, responseBody: 
2023/02/16 20:21:19 Created bucket ephemeral1, responseBody: 
2023/02/16 20:21:34 Generating docs and Populating all the buckets
2023/02/16 20:21:39 Created the secondary index bucket1_age. Waiting for it become active
2023/02/16 20:21:39 Index is 5842867856550422492 now active
2023/02/16 20:21:45 Created the secondary index bucket2_city. Waiting for it become active
2023/02/16 20:21:45 Index is 17979263322024175928 now active
Restarting memcached process ...
2023/02/16 20:21:48 []
2023/02/16 20:22:18 Using n1ql client
2023/02/16 20:22:18 Expected and Actual scan responses are the same
2023/02/16 20:22:18 Using n1ql client
2023-02-16T20:22:18.732+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T20:22:18.732+05:30 [Info] GSIC[default/ephemeral1-_default-_default-1676559138729599855] started ...
2023/02/16 20:22:19 Deleted bucket ephemeral1, responseBody: 
2023/02/16 20:22:22 Modified parameters of bucket default, responseBody: 
--- PASS: TestEphemeralBucketMCDCrash (116.91s)
=== RUN   TestScheduleIndexBasic
2023/02/16 20:22:37 In TestMultipleDeferredIndexes_BuildTogether()
2023/02/16 20:22:37 In DropAllSecondaryIndexes()
2023/02/16 20:22:37 Index found:  bucket1_age
2023/02/16 20:22:37 Dropped index bucket1_age
2023/02/16 20:22:49 Setting JSON docs in KV
2023/02/16 20:23:11 Changing config key indexer.debug.enableBackgroundIndexCreation to value false
2023/02/16 20:23:11 Creating indexes Asynchronously
2023/02/16 20:23:11 Finding definition IDs for all indexes
2023/02/16 20:23:14 Status of all indexes
2023/02/16 20:23:14 Index id_isActive is in state INDEX_STATE_INITIAL
2023/02/16 20:23:14 Index id_age is in state INDEX_STATE_SCHEDULED
2023/02/16 20:23:15 Changing config key indexer.debug.enableBackgroundIndexCreation to value true
2023/02/16 20:23:15 Waiting for all indexes to become active
2023/02/16 20:23:15 Waiting for index 7343610774201735146 in state INDEX_STATE_ERROR to go active ...
2023/02/16 20:23:15 Waiting for index 7167584549702532172 in state INDEX_STATE_INITIAL to go active ...
2023/02/16 20:23:15 Waiting for index 3529101262718150388 in state INDEX_STATE_ERROR to go active ...
2023/02/16 20:23:15 Waiting for index 102014911822756538 in state INDEX_STATE_ERROR to go active ...
2023-02-16T20:23:17.372+05:30 [Error] Error in waitForScheduledIndex Index id_company will retry building in the background for reason: Build Already In Progress. Keyspace default.
2023/02/16 20:23:18 Waiting for index 7343610774201735146 in state INDEX_STATE_ERROR to go active ...
2023/02/16 20:23:18 Waiting for index 7167584549702532172 in state INDEX_STATE_INITIAL to go active ...
2023/02/16 20:23:18 Waiting for index 3529101262718150388 in state INDEX_STATE_ERROR to go active ...
2023/02/16 20:23:18 Waiting for index 102014911822756538 in state INDEX_STATE_ERROR to go active ...
2023/02/16 20:23:21 Waiting for index 7343610774201735146 in state INDEX_STATE_ERROR to go active ...
2023/02/16 20:23:21 Waiting for index 7167584549702532172 in state INDEX_STATE_INITIAL to go active ...
2023/02/16 20:23:21 Waiting for index 3529101262718150388 in state INDEX_STATE_ERROR to go active ...
2023/02/16 20:23:21 Waiting for index 102014911822756538 in state INDEX_STATE_ERROR to go active ...
2023-02-16T20:23:23.166+05:30 [Error] Error in waitForScheduledIndex Index id_age will retry building in the background for reason: Build Already In Progress. Keyspace default.
2023/02/16 20:23:24 Waiting for index 7343610774201735146 in state INDEX_STATE_ERROR to go active ...
2023/02/16 20:23:24 Index 7167584549702532172 is now active
2023/02/16 20:23:24 Waiting for index 3529101262718150388 in state INDEX_STATE_INITIAL to go active ...
2023/02/16 20:23:24 Waiting for index 102014911822756538 in state INDEX_STATE_ERROR to go active ...
2023/02/16 20:23:27 Waiting for index 7343610774201735146 in state INDEX_STATE_ERROR to go active ...
2023/02/16 20:23:27 Waiting for index 3529101262718150388 in state INDEX_STATE_INITIAL to go active ...
2023/02/16 20:23:27 Waiting for index 102014911822756538 in state INDEX_STATE_ERROR to go active ...
2023/02/16 20:23:30 Waiting for index 7343610774201735146 in state INDEX_STATE_READY to go active ...
2023/02/16 20:23:30 Waiting for index 3529101262718150388 in state INDEX_STATE_INITIAL to go active ...
2023/02/16 20:23:30 Waiting for index 102014911822756538 in state INDEX_STATE_ERROR to go active ...
2023-02-16T20:23:31.516+05:30 [Error] Error in waitForScheduledIndex Index id_gender will retry building in the background for reason: Build Already In Progress. Keyspace default.
2023/02/16 20:23:31 Error Index id_company will retry building in the background for reason: Build Already In Progress. Keyspace default.
 Observed when creating index
2023/02/16 20:23:31 Error Index id_age will retry building in the background for reason: Build Already In Progress. Keyspace default.
 Observed when creating index
2023/02/16 20:23:31 Error Index id_gender will retry building in the background for reason: Build Already In Progress. Keyspace default.
 Observed when creating index
2023/02/16 20:23:31 Error  Observed when creating index
2023/02/16 20:23:33 Waiting for index 7343610774201735146 in state INDEX_STATE_ERROR to go active ...
2023/02/16 20:23:33 Waiting for index 3529101262718150388 in state INDEX_STATE_INITIAL to go active ...
2023/02/16 20:23:33 Waiting for index 102014911822756538 in state INDEX_STATE_ERROR to go active ...
2023/02/16 20:23:36 Waiting for index 7343610774201735146 in state INDEX_STATE_INITIAL to go active ...
2023/02/16 20:23:36 Index 3529101262718150388 is now active
2023/02/16 20:23:36 Waiting for index 102014911822756538 in state INDEX_STATE_INITIAL to go active ...
2023/02/16 20:23:39 Waiting for index 7343610774201735146 in state INDEX_STATE_INITIAL to go active ...
2023/02/16 20:23:39 Waiting for index 102014911822756538 in state INDEX_STATE_INITIAL to go active ...
2023/02/16 20:23:42 Waiting for index 7343610774201735146 in state INDEX_STATE_INITIAL to go active ...
2023/02/16 20:23:42 Waiting for index 102014911822756538 in state INDEX_STATE_INITIAL to go active ...
2023/02/16 20:23:45 Waiting for index 7343610774201735146 in state INDEX_STATE_INITIAL to go active ...
2023/02/16 20:23:45 Waiting for index 102014911822756538 in state INDEX_STATE_INITIAL to go active ...
2023/02/16 20:23:48 Waiting for index 7343610774201735146 in state INDEX_STATE_INITIAL to go active ...
2023/02/16 20:23:48 Waiting for index 102014911822756538 in state INDEX_STATE_INITIAL to go active ...
2023/02/16 20:23:51 Index 7343610774201735146 is now active
2023/02/16 20:23:51 Index 102014911822756538 is now active
--- PASS: TestScheduleIndexBasic (73.36s)
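The "Waiting for index ... in state INDEX_STATE_* to go active" lines above come from a polling loop in the test harness. A minimal sketch of that loop, simulating the transition sequence observed for index 7343610774201735146 (ERROR -> READY -> INITIAL -> ACTIVE); the state names mirror the log, but the poll function, interval, and state source are illustrative assumptions, not the real harness API:

```go
package main

import (
	"fmt"
	"time"
)

// indexState mirrors the INDEX_STATE_* values printed in the log.
type indexState int

const (
	stateScheduled indexState = iota
	stateError
	stateReady
	stateInitial
	stateActive
)

// waitForActive polls the state source until it reports ACTIVE or the
// poll budget is exhausted (hypothetical helper for illustration).
func waitForActive(next func() indexState, interval time.Duration, maxPolls int) bool {
	for i := 0; i < maxPolls; i++ {
		if next() == stateActive {
			return true
		}
		time.Sleep(interval)
	}
	return false
}

func main() {
	// Transition sequence seen in the log for one scheduled index.
	seq := []indexState{stateError, stateError, stateReady, stateInitial, stateActive}
	i := 0
	next := func() indexState {
		s := seq[i]
		if i < len(seq)-1 {
			i++
		}
		return s
	}
	fmt.Println(waitForActive(next, time.Millisecond, 10)) // prints "true"
}
```

Note that the harness treats INDEX_STATE_ERROR as transient here: the "Build Already In Progress" errors are retried in the background, so the loop keeps polling rather than failing fast.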
=== RUN   TestFlattenArrayIndexTestSetup
2023/02/16 20:23:51 In DropAllSecondaryIndexes()
2023/02/16 20:23:51 Index found:  id_gender
2023/02/16 20:23:51 Dropped index id_gender
2023/02/16 20:23:51 Index found:  id_company
2023/02/16 20:23:51 Dropped index id_company
2023/02/16 20:23:51 Index found:  id_isActive
2023/02/16 20:23:51 Dropped index id_isActive
2023/02/16 20:23:51 Index found:  id_age
2023/02/16 20:23:51 Dropped index id_age
2023/02/16 20:24:26 Flushed the bucket default, Response body: 
--- PASS: TestFlattenArrayIndexTestSetup (40.89s)
=== RUN   TestScanOnFlattenedAraryIndex
2023/02/16 20:24:39 Created the secondary index idx_flatten. Waiting for it become active
2023/02/16 20:24:39 Index is 772675510425983966 now active
2023/02/16 20:24:39 Using n1ql client
--- PASS: TestScanOnFlattenedAraryIndex (8.09s)
=== RUN   TestGroupAggrFlattenArrayIndex
2023/02/16 20:24:40 In TestGroupAggrArrayIndex()
2023/02/16 20:24:46 Created the secondary index ga_flatten_arr1. Waiting for it become active
2023/02/16 20:24:46 Index is 9855406855927107801 now active
2023/02/16 20:24:53 Created the secondary index ga_flatten_arr2. Waiting for it become active
2023/02/16 20:24:53 Index is 2003364562900916305 now active
2023/02/16 20:24:53 Scenario 1
2023-02-16T20:24:53.199+05:30 [Error] transport error between 127.0.0.1:56112->127.0.0.1:9107: write tcp 127.0.0.1:56112->127.0.0.1:9107: write: broken pipe
2023-02-16T20:24:53.199+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"]  request transport failed `write tcp 127.0.0.1:56112->127.0.0.1:9107: write: broken pipe`
2023-02-16T20:24:53.200+05:30 [Error] metadataClient:PickRandom: Fail to find indexer for all index partitions. Num partition 1.  Partition with instances 0 
2023-02-16T20:24:53.200+05:30 [Error] metadataClient:PickRandom: Replicas - [11147177844225092380], PrunedReplica - map[], FilteredReplica map[]
2023/02/16 20:24:53 Total Scanresults = 644
2023/02/16 20:24:53 Scenario 2
2023/02/16 20:24:53 Total Scanresults = 2864
2023/02/16 20:24:54 Scenario 3
2023/02/16 20:24:54 Total Scanresults = 1
2023/02/16 20:24:54 Scenario 4
2023/02/16 20:24:54 Total Scanresults = 995
2023/02/16 20:24:55 Scenario 5
2023/02/16 20:24:55 Total Scanresults = 2864
2023/02/16 20:24:56 Scenario 6
2023/02/16 20:24:56 Total Scanresults = 1
2023/02/16 20:24:56 Scenario 7
2023/02/16 20:24:56 Total Scanresults = 2970
2023/02/16 20:24:59 Scenario 8
2023/02/16 20:24:59 Total Scanresults = 2864
2023/02/16 20:25:00 Scenario 9
2023/02/16 20:25:00 Total Scanresults = 1
2023/02/16 20:25:00 Scenario 10
2023/02/16 20:25:00 Total Scanresults = 644
2023/02/16 20:25:01 Scenario 11
2023/02/16 20:25:01 Total Scanresults = 1191
2023/02/16 20:25:01 Scenario 12
2023/02/16 20:25:01 Total Scanresults = 1
2023/02/16 20:25:01 Scenario 13
2023/02/16 20:25:06 Total Scanresults = 1
2023/02/16 20:25:06 Count of scanResults is 1
2023/02/16 20:25:06 Value: [1 21]
--- PASS: TestGroupAggrFlattenArrayIndex (27.04s)
=== RUN   TestNullAndMissingValuesFlattenArrayIndex
2023/02/16 20:25:07 In TestNullAndMissingValuesFlattenArrayIndex
2023/02/16 20:25:08 Scenario-1: Scanning for docsWithNullEntries with array as non-leading key
2023/02/16 20:25:08 Using n1ql client
2023/02/16 20:25:08 Scenario-2: Scanning for docsWithMissingEntries with array as non-leading key
2023/02/16 20:25:08 Using n1ql client
2023/02/16 20:25:08 Scenario-3: Scanning for docs with 'missing' entry for first key in array expression with array as non-leading key
2023/02/16 20:25:08 Using n1ql client
2023/02/16 20:25:08 Scenario-4: Scanning for docs with valid entry for first key in array expression with array as non-leading key
2023/02/16 20:25:08 Add docs in docsWithPartialMissingLeadingKeyInArrEntry should be present in results
2023/02/16 20:25:08 Using n1ql client
2023/02/16 20:25:08 Scenario-5: Scanning for docsWithNullEntries with array as leading key
2023/02/16 20:25:08 Using n1ql client
2023/02/16 20:25:08 Scenario-6: Scanning for docsWithMissingEntries with array as leading entry
2023/02/16 20:25:08 Using n1ql client
2023/02/16 20:25:08 Scenario-7: Scanning for docs with 'missing' entry for first key in array expression
2023/02/16 20:25:08 Using n1ql client
2023/02/16 20:25:08 Scenario-8: Scanning for all docs in the index
2023/02/16 20:25:08 Docs in docsWithCompleteMissingLeadingKeyInArrEntry should be present in results
2023/02/16 20:25:08 Using n1ql client
2023/02/16 20:25:08 Scenario-9: Scanning for docs with valid entry for first key in array expression
2023/02/16 20:25:08 Add docs in docsWithPartialMissingLeadingKeyInArrEntry should be present in results
2023/02/16 20:25:08 Using n1ql client
--- PASS: TestNullAndMissingValuesFlattenArrayIndex (1.39s)
=== RUN   TestEmptyArrayFlattenArrayIndex
2023/02/16 20:25:08 In TestEmptyArrayFlattenArrayIndex
2023/02/16 20:25:09 Scenario-1: Scanning for docs with missing entry for first key in array expression
2023/02/16 20:25:09 The docs in `docsWithEmptyArrayEntry` should not be present in scanResults
2023/02/16 20:25:09 Using n1ql client
2023/02/16 20:25:09 Scenario-2: Scanning for all docs in the index
2023/02/16 20:25:09 The docs in `docsWithEmptyArrayEntry` should not be present in scanResults
2023/02/16 20:25:09 Using n1ql client
--- PASS: TestEmptyArrayFlattenArrayIndex (1.36s)
=== RUN   TestOSOSetup
2023/02/16 20:25:09 In TestOSOSetup()
2023/02/16 20:25:09 In DropAllSecondaryIndexes()
2023/02/16 20:25:09 Index found:  test_oneperprimarykey
2023/02/16 20:25:09 Dropped index test_oneperprimarykey
2023/02/16 20:25:09 Index found:  #primary
2023/02/16 20:25:10 Dropped index #primary
2023/02/16 20:25:10 Index found:  idx_flatten
2023/02/16 20:25:10 Dropped index idx_flatten
2023/02/16 20:25:10 Index found:  ga_flatten_arr2
2023/02/16 20:25:10 Dropped index ga_flatten_arr2
2023/02/16 20:25:10 Index found:  ga_flatten_arr1
2023/02/16 20:25:10 Dropped index ga_flatten_arr1
2023/02/16 20:25:45 Flushed the bucket default, Response body: 
2023/02/16 20:25:47 Populating the default bucket
2023/02/16 20:25:51 Changing config key indexer.build.enableOSO to value true
--- PASS: TestOSOSetup (41.92s)
=== RUN   TestOSOInitBuildDeleteMutation
2023/02/16 20:25:51 In TestOSOInitBuildDeleteMutation()
2023/02/16 20:25:56 Created the secondary index index_p1_oso. Waiting for it become active
2023/02/16 20:25:56 Index is 13452342302982729298 now active
2023/02/16 20:26:04 Created the secondary index index_p_oso. Waiting for it become active
2023/02/16 20:26:04 Index is 199275093785331158 now active
2023/02/16 20:26:04 Expected and Actual scan responses are the same
2023/02/16 20:26:04 CountRange() expected and actual is:  1793 and 1793
2023/02/16 20:26:04 lookupkey for CountLookup() = User8e10ca9d-ebd0-43a1-b2f4-25b5cee987c9
2023/02/16 20:26:04 CountLookup() = 1
--- PASS: TestOSOInitBuildDeleteMutation (13.01s)
=== RUN   TestOSOInitBuildIndexerRestart
2023/02/16 20:26:04 In TestOSOInitBuildIndexerRestart()
2023/02/16 20:26:04 Build command issued for the deferred indexes [13598204097549389813]
2023/02/16 20:26:05 []
2023-02-16T20:26:05.043+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023-02-16T20:26:05.044+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023/02/16 20:26:10 Waiting for index 13598204097549389813 to go active ...
2023/02/16 20:26:11 Waiting for index 13598204097549389813 to go active ...
2023/02/16 20:26:12 Waiting for index 13598204097549389813 to go active ...
2023/02/16 20:26:13 Index is 13598204097549389813 now active
2023-02-16T20:26:13.008+05:30 [Error] transport error between 127.0.0.1:48778->127.0.0.1:9107: write tcp 127.0.0.1:48778->127.0.0.1:9107: write: broken pipe
2023-02-16T20:26:13.008+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"]  request transport failed `write tcp 127.0.0.1:48778->127.0.0.1:9107: write: broken pipe`
2023-02-16T20:26:13.008+05:30 [Warn] scan failed: requestId  queryport 127.0.0.1:9107 inst 6409562004817241344 partition [0]
2023-02-16T20:26:13.008+05:30 [Warn] Scan failed with error for index 13598204097549389813.  Trying scan again with replica, reqId: :  write tcp 127.0.0.1:48778->127.0.0.1:9107: write: broken pipe from [127.0.0.1:9107] ...
2023/02/16 20:26:13 Expected and Actual scan responses are the same
2023/02/16 20:26:13 CountRange() expected and actual is:  1793 and 1793
2023/02/16 20:26:13 lookupkey for CountLookup() = User77d6aa9e-c9f7-432a-a111-c39dda99157f
2023/02/16 20:26:13 CountLookup() = 1
--- PASS: TestOSOInitBuildIndexerRestart (8.35s)
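The broken-pipe errors above are expected after the indexer restart: the scan client retries against a replica ("Trying scan again with replica" in the log) and the test still passes. A sketch of that fallback pattern; the scan functions and replica list are illustrative assumptions, not the real GsiScanClient interface:

```go
package main

import (
	"errors"
	"fmt"
)

// errBrokenPipe stands in for the transport error in the log
// ("write: broken pipe").
var errBrokenPipe = errors.New("write: broken pipe")

// scanWithReplicaRetry tries each replica in turn, falling back to the
// next one on a transport error (hypothetical helper for illustration).
func scanWithReplicaRetry(replicas []func() (int, error)) (int, error) {
	var lastErr error
	for _, scan := range replicas {
		n, err := scan()
		if err == nil {
			return n, nil
		}
		lastErr = err
	}
	return 0, fmt.Errorf("all replicas failed: %w", lastErr)
}

func main() {
	// The primary connection fails after the restart; the replica serves
	// the scan, matching the CountRange() value reported in the log.
	primary := func() (int, error) { return 0, errBrokenPipe }
	replica := func() (int, error) { return 1793, nil }
	n, err := scanWithReplicaRetry([]func() (int, error){primary, replica})
	fmt.Println(n, err) // prints "1793 <nil>"
}
```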
=== RUN   TestMissingLeadingKeyBasic
2023/02/16 20:26:13 In DropAllSecondaryIndexes()
2023/02/16 20:26:13 Index found:  index_p1_oso
2023/02/16 20:26:13 Dropped index index_p1_oso
2023/02/16 20:26:13 Index found:  index_p_oso
2023/02/16 20:26:13 Dropped index index_p_oso
2023/02/16 20:26:13 Index found:  index_p2_oso
2023/02/16 20:26:13 Dropped index index_p2_oso
2023/02/16 20:26:49 Flushed the bucket default, Response body: 
2023/02/16 20:26:49 Populating the default bucket
--- PASS: TestMissingLeadingKeyBasic (41.54s)
=== RUN   TestMissingLeadingKeyPartitioned
2023/02/16 20:26:54 In DropAllSecondaryIndexes()
2023/02/16 20:26:54 Index found:  idx_vac
2023/02/16 20:26:54 Dropped index idx_vac
--- PASS: TestMissingLeadingKeyPartitioned (6.97s)
=== RUN   TestIdxCorruptBasicSanityMultipleIndices
2023/02/16 20:27:01 In DropAllSecondaryIndexes()
2023/02/16 20:27:01 Index found:  idx_doses_partn
2023/02/16 20:27:01 Dropped index idx_doses_partn
Creating two indices ...
2023/02/16 20:27:15 Created the secondary index corrupt_idx1_age. Waiting for it become active
2023/02/16 20:27:15 Index is 17003839809879993289 now active
2023/02/16 20:27:22 Created the secondary index corrupt_idx2_company. Waiting for it become active
2023/02/16 20:27:22 Index is 701919686880015180 now active
hosts = [127.0.0.1:9108]
2023/02/16 20:27:23 Corrupting index corrupt_idx1_age slicePath /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx1_age_1976605807332316746_0.index
2023/02/16 20:27:23 Corrupting index corrupt_idx1_age mainIndexErrFilePath /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx1_age_1976605807332316746_0.index/mainIndex/error
Restarting indexer process ...
2023/02/16 20:27:23 []
2023-02-16T20:27:23.251+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023-02-16T20:27:23.254+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023/02/16 20:27:44 Using n1ql client
2023-02-16T20:27:44.197+05:30 [Error] transport error between 127.0.0.1:58902->127.0.0.1:9107: write tcp 127.0.0.1:58902->127.0.0.1:9107: write: broken pipe
2023-02-16T20:27:44.197+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] 3658353801334854334 request transport failed `write tcp 127.0.0.1:58902->127.0.0.1:9107: write: broken pipe`
2023-02-16T20:27:44.197+05:30 [Error] metadataClient:PickRandom: Fail to find indexer for all index partitions. Num partition 1.  Partition with instances 0 
2023-02-16T20:27:44.197+05:30 [Error] metadataClient:PickRandom: Replicas - [17979954307723562466], PrunedReplica - map[], FilteredReplica map[]
2023/02/16 20:27:44 Using n1ql client
--- PASS: TestIdxCorruptBasicSanityMultipleIndices (42.56s)
=== RUN   TestIdxCorruptPartitionedIndex
Creating partitioned index ...
2023/02/16 20:27:49 Created the secondary index corrupt_idx3_age. Waiting for it become active
2023/02/16 20:27:49 Index is 17933589499678938929 now active
hosts = [127.0.0.1:9108]
indexer.numPartitions = 8
Corrupting partn id 8
Getting slicepath for  1
slicePath for partn 1 = /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx3_age_14166242324144997706_1.index
Getting slicepath for  2
slicePath for partn 2 = /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx3_age_14166242324144997706_2.index
Getting slicepath for  3
slicePath for partn 3 = /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx3_age_14166242324144997706_3.index
Getting slicepath for  4
slicePath for partn 4 = /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx3_age_14166242324144997706_4.index
Getting slicepath for  5
slicePath for partn 5 = /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx3_age_14166242324144997706_5.index
Getting slicepath for  6
slicePath for partn 6 = /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx3_age_14166242324144997706_6.index
Getting slicepath for  7
slicePath for partn 7 = /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx3_age_14166242324144997706_7.index
Getting slicepath for  8
slicePath for partn 8 = /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx3_age_14166242324144997706_8.index
2023/02/16 20:27:49 Corrupting index corrupt_idx3_age slicePath /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx3_age_14166242324144997706_8.index
2023/02/16 20:27:49 Corrupting index corrupt_idx3_age mainIndexErrFilePath /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx3_age_14166242324144997706_8.index/mainIndex/error
Restarting indexer process ...
2023/02/16 20:27:49 []
2023-02-16T20:27:49.359+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023-02-16T20:27:49.363+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023/02/16 20:28:09 Using n1ql client
2023-02-16T20:28:09.324+05:30 [Error] metadataClient:PickRandom: Fail to find indexer for all index partitions. Num partition 8.  Partition with instances 7 
2023-02-16T20:28:09.324+05:30 [Error] metadataClient:PickRandom: Replicas - [14166242324144997706], PrunedReplica - map[], FilteredReplica map[]
2023-02-16T20:28:09.335+05:30 [Error] metadataClient:PickRandom: Fail to find indexer for all index partitions. Num partition 8.  Partition with instances 7 
2023-02-16T20:28:09.335+05:30 [Error] metadataClient:PickRandom: Replicas - [14166242324144997706], PrunedReplica - map[], FilteredReplica map[]
Scan error: All indexer replica is down or unavailable or unable to process request - cause: queryport.client.noHost
Verified single partition corruption
Restarting indexer process ...
2023/02/16 20:28:09 []
2023-02-16T20:28:09.384+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023-02-16T20:28:09.384+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023/02/16 20:28:29 Using n1ql client
2023-02-16T20:28:29.354+05:30 [Error] metadataClient:PickRandom: Fail to find indexer for all index partitions. Num partition 8.  Partition with instances 7 
2023-02-16T20:28:29.354+05:30 [Error] metadataClient:PickRandom: Replicas - [14166242324144997706], PrunedReplica - map[], FilteredReplica map[]
2023-02-16T20:28:29.364+05:30 [Error] metadataClient:PickRandom: Fail to find indexer for all index partitions. Num partition 8.  Partition with instances 7 
2023-02-16T20:28:29.364+05:30 [Error] metadataClient:PickRandom: Replicas - [14166242324144997706], PrunedReplica - map[], FilteredReplica map[]
Scan error: All indexer replica is down or unavailable or unable to process request - cause: queryport.client.noHost
2023/02/16 20:28:29 Corrupting index corrupt_idx3_age slicePath /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx3_age_14166242324144997706_1.index
2023/02/16 20:28:29 Corrupting index corrupt_idx3_age mainIndexErrFilePath /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx3_age_14166242324144997706_1.index/mainIndex/error
2023/02/16 20:28:29 Corrupting index corrupt_idx3_age slicePath /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx3_age_14166242324144997706_2.index
2023/02/16 20:28:29 Corrupting index corrupt_idx3_age mainIndexErrFilePath /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx3_age_14166242324144997706_2.index/mainIndex/error
2023/02/16 20:28:29 Corrupting index corrupt_idx3_age slicePath /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx3_age_14166242324144997706_3.index
2023/02/16 20:28:29 Corrupting index corrupt_idx3_age mainIndexErrFilePath /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx3_age_14166242324144997706_3.index/mainIndex/error
2023/02/16 20:28:29 Corrupting index corrupt_idx3_age slicePath /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx3_age_14166242324144997706_4.index
2023/02/16 20:28:29 Corrupting index corrupt_idx3_age mainIndexErrFilePath /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx3_age_14166242324144997706_4.index/mainIndex/error
2023/02/16 20:28:29 Corrupting index corrupt_idx3_age slicePath /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx3_age_14166242324144997706_5.index
2023/02/16 20:28:29 Corrupting index corrupt_idx3_age mainIndexErrFilePath /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx3_age_14166242324144997706_5.index/mainIndex/error
2023/02/16 20:28:29 Corrupting index corrupt_idx3_age slicePath /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx3_age_14166242324144997706_6.index
2023/02/16 20:28:29 Corrupting index corrupt_idx3_age mainIndexErrFilePath /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx3_age_14166242324144997706_6.index/mainIndex/error
2023/02/16 20:28:29 Corrupting index corrupt_idx3_age slicePath /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx3_age_14166242324144997706_7.index
2023/02/16 20:28:29 Corrupting index corrupt_idx3_age mainIndexErrFilePath /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx3_age_14166242324144997706_7.index/mainIndex/error
Skip corrupting partition 8
Restarting indexer process ...
2023/02/16 20:28:29 []
2023-02-16T20:28:29.415+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023-02-16T20:28:29.416+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023/02/16 20:28:49 Using n1ql client
--- PASS: TestIdxCorruptPartitionedIndex (65.18s)
=== RUN   TestIdxCorruptMOITwoSnapsOneCorrupt
Not running TestMOITwoSnapsOneCorrupt for plasma
--- PASS: TestIdxCorruptMOITwoSnapsOneCorrupt (0.00s)
=== RUN   TestIdxCorruptMOITwoSnapsBothCorrupt
Not running TestMOITwoSnapsBothCorrupt for plasma
--- PASS: TestIdxCorruptMOITwoSnapsBothCorrupt (0.00s)
=== RUN   TestIdxCorruptBackup
2023/02/16 20:28:49 Changing config key indexer.settings.enable_corrupt_index_backup to value true
Creating index ...
2023/02/16 20:28:53 Created the secondary index corrupt_idx6_age. Waiting for it become active
2023/02/16 20:28:53 Index is 3153704952193117688 now active
hosts = [127.0.0.1:9108]
2023/02/16 20:28:53 Corrupting index corrupt_idx6_age slicePath /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx6_age_6106511770866394065_0.index
2023/02/16 20:28:53 Corrupting index corrupt_idx6_age mainIndexErrFilePath /opt/build/ns_server/data/n_1/data/@2i/default_corrupt_idx6_age_6106511770866394065_0.index/mainIndex/error
Restarting indexer process ...
2023/02/16 20:28:54 []
2023-02-16T20:28:54.041+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023-02-16T20:28:54.042+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
--- PASS: TestIdxCorruptBackup (24.62s)
=== RUN   TestStatsPersistence
2023/02/16 20:29:14 In TestStatsPersistence()
2023/02/16 20:29:14 In DropAllSecondaryIndexes()
2023/02/16 20:29:14 Index found:  corrupt_idx2_company
2023/02/16 20:29:14 Dropped index corrupt_idx2_company
2023/02/16 20:29:18 Created the secondary index index_age. Waiting for it become active
2023/02/16 20:29:18 Index is 5823798538076609826 now active
2023/02/16 20:29:26 Created the secondary index index_gender. Waiting for it become active
2023/02/16 20:29:26 Index is 236316518792846618 now active
2023/02/16 20:29:32 Created the secondary index index_city. Waiting for it become active
2023/02/16 20:29:32 Index is 259361533506350529 now active
2023/02/16 20:29:39 Created the secondary index p1. Waiting for it become active
2023/02/16 20:29:39 Index is 15258446512795526019 now active
2023/02/16 20:29:39 === Testing for persistence enabled = true, with interval = 5  ===
2023/02/16 20:29:39 Changing config key indexer.statsPersistenceInterval to value 5
2023/02/16 20:29:39 []
2023-02-16T20:29:39.538+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023-02-16T20:29:39.538+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023/02/16 20:29:44 Using n1ql client
2023-02-16T20:29:44.508+05:30 [Error] transport error between 127.0.0.1:56528->127.0.0.1:9107: write tcp 127.0.0.1:56528->127.0.0.1:9107: write: broken pipe
2023-02-16T20:29:44.508+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] -7887268331313951776 request transport failed `write tcp 127.0.0.1:56528->127.0.0.1:9107: write: broken pipe`
2023-02-16T20:29:44.508+05:30 [Error] metadataClient:PickRandom: Fail to find indexer for all index partitions. Num partition 1.  Partition with instances 0 
2023-02-16T20:29:44.508+05:30 [Error] metadataClient:PickRandom: Replicas - [1672783045289287592], PrunedReplica - map[], FilteredReplica map[]
2023/02/16 20:29:44 Using n1ql client
2023/02/16 20:29:44 Using n1ql client
2023/02/16 20:29:51 []
2023-02-16T20:29:51.673+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023-02-16T20:29:51.673+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023/02/16 20:29:56 === Testing for persistence enabled = false, with interval = 0  ===
2023/02/16 20:29:56 Changing config key indexer.statsPersistenceInterval to value 0
2023/02/16 20:29:56 []
2023/02/16 20:30:01 Using n1ql client
2023-02-16T20:30:01.975+05:30 [Error] transport error between 127.0.0.1:34210->127.0.0.1:9107: write tcp 127.0.0.1:34210->127.0.0.1:9107: write: broken pipe
2023-02-16T20:30:01.975+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] 2526160612775650438 request transport failed `write tcp 127.0.0.1:34210->127.0.0.1:9107: write: broken pipe`
2023-02-16T20:30:01.976+05:30 [Error] metadataClient:PickRandom: Fail to find indexer for all index partitions. Num partition 1.  Partition with instances 0 
2023-02-16T20:30:01.976+05:30 [Error] metadataClient:PickRandom: Replicas - [1672783045289287592], PrunedReplica - map[], FilteredReplica map[]
2023/02/16 20:30:02 Using n1ql client
2023/02/16 20:30:02 Using n1ql client
2023/02/16 20:30:04 []
2023-02-16T20:30:04.147+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023-02-16T20:30:04.148+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023/02/16 20:30:09 === Testing for persistence enabled = true, with interval = 10  ===
2023/02/16 20:30:09 Changing config key indexer.statsPersistenceInterval to value 10
2023/02/16 20:30:09 []
2023/02/16 20:30:14 Using n1ql client
2023-02-16T20:30:14.459+05:30 [Error] transport error between 127.0.0.1:35200->127.0.0.1:9107: write tcp 127.0.0.1:35200->127.0.0.1:9107: write: broken pipe
2023-02-16T20:30:14.459+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] -1158932399369912719 request transport failed `write tcp 127.0.0.1:35200->127.0.0.1:9107: write: broken pipe`
2023-02-16T20:30:14.459+05:30 [Error] metadataClient:PickRandom: Fail to find indexer for all index partitions. Num partition 1.  Partition with instances 0 
2023-02-16T20:30:14.459+05:30 [Error] metadataClient:PickRandom: Replicas - [1672783045289287592], PrunedReplica - map[], FilteredReplica map[]
2023/02/16 20:30:14 Using n1ql client
2023/02/16 20:30:14 Using n1ql client
2023/02/16 20:30:26 []
2023-02-16T20:30:26.621+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023-02-16T20:30:26.621+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023/02/16 20:30:31 === Testing for persistence enabled = true, with interval = 5  ===
2023/02/16 20:30:31 Changing config key indexer.statsPersistenceInterval to value 5
2023/02/16 20:30:31 []
2023/02/16 20:30:36 Using n1ql client
2023-02-16T20:30:36.956+05:30 [Error] transport error between 127.0.0.1:36010->127.0.0.1:9107: write tcp 127.0.0.1:36010->127.0.0.1:9107: write: broken pipe
2023-02-16T20:30:36.956+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] -7062207298105427214 request transport failed `write tcp 127.0.0.1:36010->127.0.0.1:9107: write: broken pipe`
2023-02-16T20:30:36.956+05:30 [Error] metadataClient:PickRandom: Fail to find indexer for all index partitions. Num partition 1.  Partition with instances 0 
2023-02-16T20:30:36.956+05:30 [Error] metadataClient:PickRandom: Replicas - [1672783045289287592], PrunedReplica - map[], FilteredReplica map[]
2023/02/16 20:30:36 Using n1ql client
2023/02/16 20:30:37 Using n1ql client
2023/02/16 20:30:44 []
2023-02-16T20:30:44.168+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
2023-02-16T20:30:44.168+05:30 [Error] WatcherServer.runOnce() : Watcher terminated unexpectedly.
--- PASS: TestStatsPersistence (95.21s)
=== RUN   TestStats_StorageStatistics
2023/02/16 20:30:49 In TestStats_StorageStatistics()
2023/02/16 20:30:49 Index found:  index_age
2023/02/16 20:30:49 Stats from Index4 StorageStatistics for index index_age are [map[AVG_ITEM_SIZE:48 AVG_PAGE_SIZE:13356 NUM_DELETE:0 NUM_INSERT:0 NUM_ITEMS:2000 NUM_PAGES:8 PARTITION_ID:0 RESIDENT_RATIO:0]]
2023/02/16 20:30:49 Index found:  p1
2023/02/16 20:30:49 Stats from Index4 StorageStatistics for index p1 are [map[AVG_ITEM_SIZE:36 AVG_PAGE_SIZE:10046 NUM_DELETE:0 NUM_INSERT:0 NUM_ITEMS:2000 NUM_PAGES:6 PARTITION_ID:0 RESIDENT_RATIO:0]]
--- PASS: TestStats_StorageStatistics (0.21s)
FAIL
exit status 1
FAIL	github.com/couchbase/indexing/secondary/tests/functionaltests	8577.502s
Indexer Go routine dump logged in /opt/build/ns_server/logs/n_1/indexer_functests_pprof.log
curl: /opt/build/install/lib/libcurl.so.4: no version information available (required by curl)
2023/02/16 20:30:53 In TestMain()
2023/02/16 20:30:54 Changing config key indexer.api.enableTestServer to value true
2023/02/16 20:30:54 Using plasma for creating indexes
2023/02/16 20:30:54 Changing config key indexer.settings.storage_mode to value plasma
=== RUN   TestRangeWithConcurrentAddMuts
2023/02/16 20:30:59 In TestRangeWithConcurrentAddMuts()
2023/02/16 20:30:59 In DropAllSecondaryIndexes()
2023/02/16 20:30:59 Index found:  index_age
2023/02/16 20:30:59 Dropped index index_age
2023/02/16 20:30:59 Index found:  p1
2023/02/16 20:30:59 Dropped index p1
2023/02/16 20:30:59 Index found:  index_gender
2023/02/16 20:30:59 Dropped index index_gender
2023/02/16 20:30:59 Index found:  index_city
2023/02/16 20:30:59 Dropped index index_city
2023/02/16 20:30:59 Generating JSON docs
2023/02/16 20:30:59 Setting initial JSON docs in KV
2023/02/16 20:31:00 All indexers are active
2023/02/16 20:31:00 Creating a 2i
2023/02/16 20:31:03 Created the secondary index index_company. Waiting for it become active
2023/02/16 20:31:03 Index is 4193923921916802179 now active
2023/02/16 20:31:03 In Range Scan for Thread 1: 
2023/02/16 20:31:03 CreateDocs:: Creating mutations
2023/02/16 20:31:03 ListAllSecondaryIndexes() for Thread 1: : Index index_company Bucket default
--- PASS: TestRangeWithConcurrentAddMuts (124.06s)
=== RUN   TestRangeWithConcurrentDelMuts
2023/02/16 20:33:03 In TestRangeWithConcurrentDelMuts()
2023/02/16 20:33:03 Generating JSON docs
2023/02/16 20:33:05 Setting initial JSON docs in KV
2023/02/16 20:33:23 All indexers are active
2023/02/16 20:33:23 Creating a 2i
2023/02/16 20:33:23 Index found:  index_company
2023/02/16 20:33:23 In Range Scan for Thread 1: 
2023/02/16 20:33:23 CreateDocs:: Delete mutations
2023/02/16 20:33:23 ListAllSecondaryIndexes() for Thread 1: : Index index_company Bucket default
--- PASS: TestRangeWithConcurrentDelMuts (140.02s)
=== RUN   TestScanWithConcurrentIndexOps
2023/02/16 20:35:23 In TestScanWithConcurrentIndexOps()
2023/02/16 20:35:23 Generating JSON docs
2023/02/16 20:35:28 Setting initial JSON docs in KV
2023/02/16 20:36:35 All indexers are active
2023/02/16 20:36:35 Creating a 2i
2023/02/16 20:36:35 Index found:  index_company
2023/02/16 20:36:35 In Range Scan for Thread 1: 
2023/02/16 20:36:35 Create and Drop index operations
2023/02/16 20:36:35 ListAllSecondaryIndexes() for Thread 1: : Index index_company Bucket default
2023/02/16 20:37:01 Created the secondary index index_age. Waiting for it become active
2023/02/16 20:37:01 Index is 8943655041817435040 now active
2023/02/16 20:37:25 Created the secondary index index_firstname. Waiting for it become active
2023/02/16 20:37:25 Index is 15095996613825123498 now active
2023/02/16 20:37:27 Dropping the secondary index index_age
2023/02/16 20:37:27 Index dropped
2023/02/16 20:37:28 Dropping the secondary index index_firstname
2023/02/16 20:37:28 Index dropped
2023/02/16 20:37:52 Created the secondary index index_age. Waiting for it become active
2023/02/16 20:37:52 Index is 5599246132219547088 now active
2023/02/16 20:38:16 Created the secondary index index_firstname. Waiting for it become active
2023/02/16 20:38:16 Index is 17654120542313815285 now active
2023/02/16 20:38:17 Dropping the secondary index index_age
2023/02/16 20:38:17 Index dropped
2023/02/16 20:38:18 Dropping the secondary index index_firstname
2023/02/16 20:38:18 Index dropped
2023/02/16 20:38:39 Created the secondary index index_age. Waiting for it become active
2023/02/16 20:38:39 Index is 15244844989776673616 now active
2023/02/16 20:39:00 Created the secondary index index_firstname. Waiting for it become active
2023/02/16 20:39:00 Index is 1609923575906394001 now active
2023/02/16 20:39:01 Dropping the secondary index index_age
2023/02/16 20:39:01 Index dropped
2023/02/16 20:39:02 Dropping the secondary index index_firstname
2023/02/16 20:39:02 Index dropped
--- PASS: TestScanWithConcurrentIndexOps (220.55s)
=== RUN   TestConcurrentScans_SameIndex
2023/02/16 20:39:03 In TestConcurrentScans_SameIndex()
2023/02/16 20:39:03 Generating JSON docs
2023/02/16 20:39:09 Setting initial JSON docs in KV
2023/02/16 20:40:14 All indexers are active
2023/02/16 20:40:14 Creating a 2i
2023/02/16 20:40:14 Index found:  index_company
2023/02/16 20:40:14 In Range Scan for Thread 6: 
2023/02/16 20:40:14 In Range Scan for Thread 1: 
2023/02/16 20:40:14 In Range Scan for Thread 2: 
2023/02/16 20:40:14 In Range Scan for Thread 3: 
2023/02/16 20:40:14 In Range Scan for Thread 4: 
2023/02/16 20:40:14 In Range Scan for Thread 5: 
2023/02/16 20:40:15 ListAllSecondaryIndexes() for Thread 1: : Index index_company Bucket default
2023/02/16 20:40:15 ListAllSecondaryIndexes() for Thread 2: : Index index_company Bucket default
2023/02/16 20:40:15 ListAllSecondaryIndexes() for Thread 4: : Index index_company Bucket default
2023/02/16 20:40:15 ListAllSecondaryIndexes() for Thread 6: : Index index_company Bucket default
2023/02/16 20:40:15 ListAllSecondaryIndexes() for Thread 3: : Index index_company Bucket default
2023/02/16 20:40:15 ListAllSecondaryIndexes() for Thread 5: : Index index_company Bucket default
--- PASS: TestConcurrentScans_SameIndex (192.47s)
=== RUN   TestConcurrentScans_MultipleIndexes
2023/02/16 20:42:16 In TestConcurrentScans_MultipleIndexes()
2023/02/16 20:42:16 Generating JSON docs
2023/02/16 20:42:21 Setting initial JSON docs in KV
2023/02/16 20:43:28 All indexers are active
2023/02/16 20:43:28 Creating multiple indexes
2023/02/16 20:43:28 Index found:  index_company
2023/02/16 20:44:01 Created the secondary index index_age. Waiting for it become active
2023/02/16 20:44:01 Index is 2180492059873935154 now active
2023/02/16 20:44:32 Created the secondary index index_firstname. Waiting for it become active
2023/02/16 20:44:32 Index is 12848359773278628072 now active
2023/02/16 20:44:32 In Range Scan for Thread 3: 
2023/02/16 20:44:32 In Range Scan for Thread 1: 
2023/02/16 20:44:32 In Range Scan
2023/02/16 20:44:32 ListAllSecondaryIndexes() for Thread 3: : Index index_firstname Bucket default
2023/02/16 20:44:32 ListAllSecondaryIndexes() for Thread 3: : Index index_age Bucket default
2023/02/16 20:44:32 ListAllSecondaryIndexes() for Thread 3: : Index index_company Bucket default
2023/02/16 20:44:32 ListAllSecondaryIndexes() for Thread 1: : Index index_age Bucket default
2023/02/16 20:44:32 ListAllSecondaryIndexes() for Thread 1: : Index index_company Bucket default
2023/02/16 20:44:32 ListAllSecondaryIndexes() for Thread 1: : Index index_firstname Bucket default
--- PASS: TestConcurrentScans_MultipleIndexes (257.27s)
=== RUN   TestMutationsWithMultipleIndexBuilds
2023/02/16 20:46:33 In TestMutationsWithMultipleIndexBuilds()
2023/02/16 20:46:33 In DropAllSecondaryIndexes()
2023/02/16 20:46:33 Index found:  index_company
2023/02/16 20:46:33 Dropped index index_company
2023/02/16 20:46:33 Index found:  index_age
2023/02/16 20:46:33 Dropped index index_age
2023/02/16 20:46:33 Index found:  index_firstname
2023/02/16 20:46:33 Dropped index index_firstname
2023/02/16 20:46:33 Generating JSON docs
2023/02/16 20:46:39 Setting initial JSON docs in KV
2023/02/16 20:47:57 Created the secondary index index_primary. Waiting for it become active
2023/02/16 20:47:57 Index is 11894764000164495160 now active
2023/02/16 20:47:57 Creating multiple indexes in deferred mode
2023/02/16 20:47:58 Build Indexes and wait for indexes to become active: [index_company index_age index_firstname index_lastname]
2023/02/16 20:47:58 Build command issued for the deferred indexes [index_company index_age index_firstname index_lastname], bucket: default, scope: _default, coll: _default
2023/02/16 20:47:58 Waiting for the index index_company to become active
2023/02/16 20:47:58 Waiting for index 16768131804120255922 to go active ...
2023/02/16 20:49:19 Index is 16768131804120255922 now active
2023/02/16 20:49:19 Waiting for the index index_age to become active
2023/02/16 20:49:19 Index is 13339998810366752032 now active
2023/02/16 20:49:19 Waiting for the index index_firstname to become active
2023/02/16 20:49:19 Index is 8894321539503304772 now active
2023/02/16 20:49:19 Waiting for the index index_lastname to become active
2023/02/16 20:49:19 Index is 13578839462448210708 now active
--- PASS: TestMutationsWithMultipleIndexBuilds (165.80s)
PASS
ok  	github.com/couchbase/indexing/secondary/tests/largedatatests	1105.652s
Indexer Go routine dump logged in /opt/build/ns_server/logs/n_1/indexer_largedata_pprof.log
curl: /opt/build/install/lib/libcurl.so.4: no version information available (required by curl)
100 67972    0 67972    0     0  9261k      0 --:--:-- --:--:-- --:--:-- 10.8M

Serverless tests

Starting server: attempt 1

2023/02/16 20:52:23 In TestMain()
2023/02/16 20:52:23 otp node fetch error: json: cannot unmarshal string into Go value of type couchbase.Pool
2023/02/16 20:52:23 Initialising services with role: kv,n1ql on node: 127.0.0.1:9000
2023/02/16 20:52:24 Initialising web UI on node: 127.0.0.1:9000
2023/02/16 20:52:24 InitWebCreds, response is: {"newBaseUri":"http://127.0.0.1:9000/"}
2023/02/16 20:52:24 Setting data quota of 1500M and Index quota of 1500M
2023/02/16 20:52:24 Adding serverGroup: Group 2 via server: 127.0.0.1:9000
2023/02/16 20:52:24 AddServerGroup: Successfully added serverGroup 127.0.0.1:9000, server: Group 2, response: []
2023/02/16 20:52:24 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 20:52:31 AddNodeWithServerGroup: Successfully added node: 127.0.0.1:9001 (role index, serverGroup: Group 2), response: {"otpNode":"n_1@127.0.0.1"}
2023/02/16 20:52:32 Adding node: https://127.0.0.1:19002 with role: index to the cluster with uri: /pools/default/serverGroups/0/addNode
2023/02/16 20:52:38 AddNodeWithServerGroup: Successfully added node: 127.0.0.1:9002 (role index, serverGroup: Group 1), response: {"otpNode":"n_2@127.0.0.1"}
2023/02/16 20:52:44 Rebalance progress: 0
2023/02/16 20:52:49 Rebalance progress: 100
2023/02/16 20:52:54 Created bucket default, responseBody: 
2023/02/16 20:52:59 Cluster status: map[127.0.0.1:9001:[index] 127.0.0.1:9002:[index] 172.31.5.112:9000:[kv n1ql]]
2023/02/16 20:52:59 Successfully initialised cluster
2023/02/16 20:52:59 Cluster status: map[127.0.0.1:9001:[index] 127.0.0.1:9002:[index] 172.31.5.112:9000:[kv n1ql]]
2023/02/16 20:52:59 Changing config key queryport.client.settings.backfillLimit to value 0
2023/02/16 20:52:59 Changing config key queryport.client.log_level to value Warn
2023/02/16 20:52:59 Changing config key indexer.api.enableTestServer to value true
2023/02/16 20:52:59 Changing config key indexer.settings.persisted_snapshot_init_build.moi.interval to value 60000
2023/02/16 20:52:59 Changing config key indexer.settings.persisted_snapshot.moi.interval to value 60000
2023/02/16 20:53:00 Changing config key indexer.settings.log_level to value info
2023/02/16 20:53:00 Changing config key indexer.settings.storage_mode.disable_upgrade to value true
2023/02/16 20:53:00 Changing config key indexer.settings.rebalance.blob_storage_scheme to value 
2023/02/16 20:53:00 Changing config key indexer.plasma.serverless.shardCopy.dbg to value true
2023/02/16 20:53:00 Changing config key indexer.rebalance.serverless.transferBatchSize to value 2
2023/02/16 20:53:00 Changing config key indexer.client_stats_refresh_interval to value 500
2023/02/16 20:53:00 Using plasma for creating indexes
2023/02/16 20:53:00 Changing config key indexer.settings.storage_mode to value plasma
2023/02/16 20:53:05 Data file exists. Skipping download
2023/02/16 20:53:05 Data file exists. Skipping download
2023/02/16 20:53:07 In DropAllSecondaryIndexes()
2023/02/16 20:53:07 Emptying the default bucket
2023/02/16 20:53:07 Deleted bucket default, responseBody: 
2023/02/16 20:53:07 http://127.0.0.1:9000/pools/default/buckets/bucket_1
2023/02/16 20:53:07 &{DELETE http://127.0.0.1:9000/pools/default/buckets/bucket_1 HTTP/1.1 1 1 map[Authorization:[Basic QWRtaW5pc3RyYXRvcjphc2Rhc2Q=] Content-Type:[application/x-www-form-urlencoded; charset=UTF-8]]   0 [] false 127.0.0.1:9000 map[] map[]  map[]      0xc000138000}
2023/02/16 20:53:07 &{404 Object Not Found 404 HTTP/1.1 1 1 map[Cache-Control:[no-cache,no-store,must-revalidate] Content-Length:[31] Content-Type:[text/plain] Date:[Thu, 16 Feb 2023 15:23:06 GMT] Expires:[Thu, 01 Jan 1970 00:00:00 GMT] Pragma:[no-cache] Server:[Couchbase Server] X-Content-Type-Options:[nosniff] X-Frame-Options:[DENY] X-Permitted-Cross-Domain-Policies:[none] X-Xss-Protection:[1; mode=block]] 0xc0024a8840 31 [] false false map[] 0xc0002b6300 }
2023/02/16 20:53:07 DeleteBucket failed for bucket bucket_1 
2023/02/16 20:53:07 Deleted bucket bucket_1, responseBody: Requested resource not found.
2023/02/16 20:53:07 http://127.0.0.1:9000/pools/default/buckets/bucket_%!(NOVERB)
2023/02/16 20:53:07 &{DELETE http://127.0.0.1:9000/pools/default/buckets/bucket_%252 HTTP/1.1 1 1 map[Authorization:[Basic QWRtaW5pc3RyYXRvcjphc2Rhc2Q=] Content-Type:[application/x-www-form-urlencoded; charset=UTF-8]]   0 [] false 127.0.0.1:9000 map[] map[]  map[]      0xc000138000}
2023/02/16 20:53:07 &{404 Object Not Found 404 HTTP/1.1 1 1 map[Cache-Control:[no-cache,no-store,must-revalidate] Content-Length:[31] Content-Type:[text/plain] Date:[Thu, 16 Feb 2023 15:23:06 GMT] Expires:[Thu, 01 Jan 1970 00:00:00 GMT] Pragma:[no-cache] Server:[Couchbase Server] X-Content-Type-Options:[nosniff] X-Frame-Options:[DENY] X-Permitted-Cross-Domain-Policies:[none] X-Xss-Protection:[1; mode=block]] 0xc0024a89c0 31 [] false false map[] 0xc0001a2400 }
2023/02/16 20:53:07 DeleteBucket failed for bucket bucket_%2 
2023/02/16 20:53:07 Deleted bucket bucket_%2, responseBody: Requested resource not found.
2023/02/16 20:53:22 cleanupStorageDir: Cleaning up /opt/build/goproj/src/github.com/couchbase/indexing/secondary/tests/serverlesstests/shard_rebalance_storage_dir
=== RUN   TestIndexPlacement
2023/02/16 20:53:22 Created bucket bucket_1, responseBody: 
2023/02/16 20:53:22 WaitForBucketCreation: Checking bucket(bucket_1) creation for host: 127.0.0.1:9000
2023/02/16 20:53:23 WaitForBucketCreation: Checking bucket(bucket_1) creation for host: 127.0.0.1:9001
2023/02/16 20:53:23 WaitForBucketCreation: Checking bucket(bucket_1) creation for host: 127.0.0.1:9002
2023/02/16 20:53:24 Created collection succeeded for bucket: bucket_1, scope: _default, collection: c1, body: {"uid":"2"}
2023/02/16 20:53:24 TestIndexPlacement: Manifest for bucket: bucket_1, scope: _default, collection: c1 is: map[uid:2]
2023/02/16 20:53:24 WaitForCollectionCreation: Checking collection creation for host: 127.0.0.1:9000, bucket: bucket_1, scope: _default, collection: c1
2023/02/16 20:53:24 WaitForCollectionCreation: Checking collection creation for host: 127.0.0.1:9001, bucket: bucket_1, scope: _default, collection: c1
2023/02/16 20:53:24 WaitForCollectionCreation: Checking collection creation for host: 127.0.0.1:9002, bucket: bucket_1, scope: _default, collection: c1
2023/02/16 20:53:24 Received OK response from ensureManifest, bucket: bucket_1, uid: 2
2023/02/16 20:53:24 Executing N1ql statement: create index idx_1 on `bucket_1`.`_default`.`c1`(company)
2023/02/16 20:53:31 Index status is: Ready for index: idx_1, bucket: bucket_1, scope: _default, collection: c1
2023/02/16 20:53:31 Index status is: Ready for index: idx_1 (replica 1), bucket: bucket_1, scope: _default, collection: c1
2023/02/16 20:53:33 scanIndexReplicas: Scanning all for index: idx_1, bucket: bucket_1, scope: _default, collection: c1
2023-02-16T20:53:33.744+05:30 [Info] creating GsiClient for 127.0.0.1:9000
2023/02/16 20:53:35 Deleted bucket bucket_1, responseBody: 
--- PASS: TestIndexPlacement (27.69s)
=== RUN   TestShardIdMapping
2023/02/16 20:53:50 Created bucket bucket_1, responseBody: 
2023/02/16 20:53:50 WaitForBucketCreation: Checking bucket(bucket_1) creation for host: 127.0.0.1:9000
2023/02/16 20:53:51 WaitForBucketCreation: Checking bucket(bucket_1) creation for host: 127.0.0.1:9001
2023/02/16 20:53:51 WaitForBucketCreation: Checking bucket(bucket_1) creation for host: 127.0.0.1:9002
2023/02/16 20:53:52 Executing N1ql statement: create index idx_secondary on `bucket_1`.`_default`.`_default`(age)
2023/02/16 20:54:01 Index status is: Ready for index: idx_secondary, bucket: bucket_1, scope: _default, collection: _default
2023/02/16 20:54:01 Index status is: Ready for index: idx_secondary (replica 1), bucket: bucket_1, scope: _default, collection: _default
2023/02/16 20:54:01 Executing N1ql statement: create index idx_secondary_defer on `bucket_1`.`_default`.`_default`(company) with {"defer_build":true}
2023/02/16 20:54:05 Index status is: Created for index: idx_secondary_defer, bucket: bucket_1, scope: _default, collection: _default
2023/02/16 20:54:05 Index status is: Created for index: idx_secondary_defer (replica 1), bucket: bucket_1, scope: _default, collection: _default
2023/02/16 20:54:05 Executing N1ql statement: create primary index `#primary` on `bucket_1`.`_default`.`_default`
2023/02/16 20:54:10 Index status is: Ready for index: #primary, bucket: bucket_1, scope: _default, collection: _default
2023/02/16 20:54:10 Index status is: Ready for index: #primary (replica 1), bucket: bucket_1, scope: _default, collection: _default
2023/02/16 20:54:10 Executing N1ql statement: create primary index `#primary_defer` on `bucket_1`.`_default`.`_default` with {"defer_build":true}
2023/02/16 20:54:16 Index status is: Created for index: #primary_defer, bucket: bucket_1, scope: _default, collection: _default
2023/02/16 20:54:16 Index status is: Created for index: #primary_defer (replica 1), bucket: bucket_1, scope: _default, collection: _default
2023/02/16 20:54:16 Executing N1ql statement: create index idx_partitioned on `bucket_1`.`_default`.`_default`(emailid) partition by hash(meta().id)
2023/02/16 20:54:25 Index status is: Ready for index: idx_partitioned, bucket: bucket_1, scope: _default, collection: _default
2023/02/16 20:54:25 Index status is: Ready for index: idx_partitioned (replica 1), bucket: bucket_1, scope: _default, collection: _default
2023/02/16 20:54:25 Executing N1ql statement: create index idx_partitioned_defer on `bucket_1`.`_default`.`_default`(balance) partition by hash(meta().id)  with {"defer_build":true}
2023/02/16 20:54:31 Index status is: Created for index: idx_partitioned_defer, bucket: bucket_1, scope: _default, collection: _default
2023/02/16 20:54:31 Index status is: Created for index: idx_partitioned_defer (replica 1), bucket: bucket_1, scope: _default, collection: _default
2023/02/16 20:54:31 Created collection succeeded for bucket: bucket_1, scope: _default, collection: c1, body: {"uid":"2"}
2023/02/16 20:54:31 TestIndexPlacement: Manifest for bucket: bucket_1, scope: _default, collection: c1 is: map[uid:2]
2023/02/16 20:54:31 WaitForCollectionCreation: Checking collection creation for host: 127.0.0.1:9000, bucket: bucket_1, scope: _default, collection: c1
2023/02/16 20:54:31 WaitForCollectionCreation: Checking collection creation for host: 127.0.0.1:9001, bucket: bucket_1, scope: _default, collection: c1
2023/02/16 20:54:31 WaitForCollectionCreation: Checking collection creation for host: 127.0.0.1:9002, bucket: bucket_1, scope: _default, collection: c1
2023/02/16 20:54:31 Received OK response from ensureManifest, bucket: bucket_1, uid: 2
2023/02/16 20:54:32 Executing N1ql statement: create index idx_secondary on `bucket_1`.`_default`.`c1`(age)
2023/02/16 20:54:41 Index status is: Ready for index: idx_secondary, bucket: bucket_1, scope: _default, collection: c1
2023/02/16 20:54:41 Index status is: Ready for index: idx_secondary (replica 1), bucket: bucket_1, scope: _default, collection: c1
2023/02/16 20:54:41 Executing N1ql statement: create index idx_secondary_defer on `bucket_1`.`_default`.`c1`(company) with {"defer_build":true}
2023/02/16 20:54:46 Index status is: Created for index: idx_secondary_defer, bucket: bucket_1, scope: _default, collection: c1
2023/02/16 20:54:46 Index status is: Created for index: idx_secondary_defer (replica 1), bucket: bucket_1, scope: _default, collection: c1
2023/02/16 20:54:46 Executing N1ql statement: create primary index `#primary` on `bucket_1`.`_default`.`c1`
2023/02/16 20:54:50 Index status is: Ready for index: #primary, bucket: bucket_1, scope: _default, collection: c1
2023/02/16 20:54:50 Index status is: Ready for index: #primary (replica 1), bucket: bucket_1, scope: _default, collection: c1
2023/02/16 20:54:50 Executing N1ql statement: create primary index `#primary_defer` on `bucket_1`.`_default`.`c1` with {"defer_build":true}
2023/02/16 20:54:56 Index status is: Created for index: #primary_defer, bucket: bucket_1, scope: _default, collection: c1
2023/02/16 20:54:56 Index status is: Created for index: #primary_defer (replica 1), bucket: bucket_1, scope: _default, collection: c1
2023/02/16 20:54:56 Executing N1ql statement: create index idx_partitioned on `bucket_1`.`_default`.`c1`(emailid) partition by hash(meta().id)
2023/02/16 20:55:06 Index status is: Ready for index: idx_partitioned, bucket: bucket_1, scope: _default, collection: c1
2023/02/16 20:55:06 Index status is: Ready for index: idx_partitioned (replica 1), bucket: bucket_1, scope: _default, collection: c1
2023/02/16 20:55:06 Executing N1ql statement: create index idx_partitioned_defer on `bucket_1`.`_default`.`c1`(balance) partition by hash(meta().id)  with {"defer_build":true}
2023/02/16 20:55:11 Index status is: Created for index: idx_partitioned_defer, bucket: bucket_1, scope: _default, collection: c1
2023/02/16 20:55:11 Index status is: Created for index: idx_partitioned_defer (replica 1), bucket: bucket_1, scope: _default, collection: c1
2023/02/16 20:55:11 Created collection succeeded for bucket: bucket_1, scope: _default, collection: c2%, body: {"uid":"3"}
2023/02/16 20:55:11 TestIndexPlacement: Manifest for bucket: bucket_1, scope: _default, collection: c2% is: map[uid:3]
2023/02/16 20:55:11 WaitForCollectionCreation: Checking collection creation for host: 127.0.0.1:9000, bucket: bucket_1, scope: _default, collection: c2%
2023/02/16 20:55:11 WaitForCollectionCreation: Checking collection creation for host: 127.0.0.1:9001, bucket: bucket_1, scope: _default, collection: c2%
2023/02/16 20:55:11 WaitForCollectionCreation: Checking collection creation for host: 127.0.0.1:9002, bucket: bucket_1, scope: _default, collection: c2%
2023/02/16 20:55:11 Received OK response from ensureManifest, bucket: bucket_1, uid: 3
2023/02/16 20:55:12 Executing N1ql statement: create index idx_secondary on `bucket_1`.`_default`.`c2%`(age)
2023/02/16 20:55:21 Index status is: Ready for index: idx_secondary, bucket: bucket_1, scope: _default, collection: c2%
2023/02/16 20:55:21 Index status is: Ready for index: idx_secondary (replica 1), bucket: bucket_1, scope: _default, collection: c2%
2023/02/16 20:55:21 Executing N1ql statement: create index idx_secondary_defer on `bucket_1`.`_default`.`c2%`(company) with {"defer_build":true}
2023/02/16 20:55:26 Index status is: Created for index: idx_secondary_defer, bucket: bucket_1, scope: _default, collection: c2%
2023/02/16 20:55:26 Index status is: Created for index: idx_secondary_defer (replica 1), bucket: bucket_1, scope: _default, collection: c2%
2023/02/16 20:55:26 Executing N1ql statement: create primary index `#primary` on `bucket_1`.`_default`.`c2%`
2023/02/16 20:55:35 Index status is: Ready for index: #primary, bucket: bucket_1, scope: _default, collection: c2%
2023/02/16 20:55:35 Index status is: Ready for index: #primary (replica 1), bucket: bucket_1, scope: _default, collection: c2%
2023/02/16 20:55:35 Executing N1ql statement: create primary index `#primary_defer` on `bucket_1`.`_default`.`c2%` with {"defer_build":true}
2023/02/16 20:55:41 Index status is: Created for index: #primary_defer, bucket: bucket_1, scope: _default, collection: c2%
2023/02/16 20:55:41 Index status is: Created for index: #primary_defer (replica 1), bucket: bucket_1, scope: _default, collection: c2%
2023/02/16 20:55:41 Executing N1ql statement: create index idx_partitioned on `bucket_1`.`_default`.`c2%`(emailid) partition by hash(meta().id)
2023/02/16 20:55:50 Index status is: Ready for index: idx_partitioned, bucket: bucket_1, scope: _default, collection: c2%
2023/02/16 20:55:51 Index status is: Ready for index: idx_partitioned (replica 1), bucket: bucket_1, scope: _default, collection: c2%
2023/02/16 20:55:51 Executing N1ql statement: create index idx_partitioned_defer on `bucket_1`.`_default`.`c2%`(balance) partition by hash(meta().id)  with {"defer_build":true}
2023/02/16 20:55:56 Index status is: Created for index: idx_partitioned_defer, bucket: bucket_1, scope: _default, collection: c2%
2023/02/16 20:55:56 Index status is: Created for index: idx_partitioned_defer (replica 1), bucket: bucket_1, scope: _default, collection: c2%
2023/02/16 20:55:56 Created bucket bucket_%2, responseBody: 
2023/02/16 20:55:56 WaitForBucketCreation: Checking bucket(bucket_%2) creation for host: 127.0.0.1:9000
2023/02/16 20:55:58 WaitForBucketCreation: Checking bucket(bucket_%2) creation for host: 127.0.0.1:9001
2023/02/16 20:55:58 WaitForBucketCreation: Checking bucket(bucket_%2) creation for host: 127.0.0.1:9002
2023/02/16 20:55:58 Executing N1ql statement: create index idx_secondary on `bucket_%2`.`_default`.`_default`(age)
2023/02/16 20:56:05 Index status is: Ready for index: idx_secondary, bucket: bucket_%2, scope: _default, collection: _default
2023/02/16 20:56:05 Index status is: Ready for index: idx_secondary (replica 1), bucket: bucket_%2, scope: _default, collection: _default
2023/02/16 20:56:05 Executing N1ql statement: create index idx_secondary_defer on `bucket_%2`.`_default`.`_default`(company) with {"defer_build":true}
2023/02/16 20:56:11 Index status is: Created for index: idx_secondary_defer, bucket: bucket_%2, scope: _default, collection: _default
2023/02/16 20:56:11 Index status is: Created for index: idx_secondary_defer (replica 1), bucket: bucket_%2, scope: _default, collection: _default
2023/02/16 20:56:11 Executing N1ql statement: create primary index `#primary` on `bucket_%2`.`_default`.`_default`
2023/02/16 20:56:20 Index status is: Ready for index: #primary, bucket: bucket_%2, scope: _default, collection: _default
2023/02/16 20:56:20 Index status is: Ready for index: #primary (replica 1), bucket: bucket_%2, scope: _default, collection: _default
2023/02/16 20:56:20 Executing N1ql statement: create primary index `#primary_defer` on `bucket_%2`.`_default`.`_default` with {"defer_build":true}
2023/02/16 20:56:26 Index status is: Created for index: #primary_defer, bucket: bucket_%2, scope: _default, collection: _default
2023/02/16 20:56:26 Index status is: Created for index: #primary_defer (replica 1), bucket: bucket_%2, scope: _default, collection: _default
2023/02/16 20:56:26 Executing N1ql statement: create index idx_partitioned on `bucket_%2`.`_default`.`_default`(emailid) partition by hash(meta().id)
2023/02/16 20:56:36 Index status is: Ready for index: idx_partitioned, bucket: bucket_%2, scope: _default, collection: _default
2023/02/16 20:56:36 Index status is: Ready for index: idx_partitioned (replica 1), bucket: bucket_%2, scope: _default, collection: _default
2023/02/16 20:56:36 Executing N1ql statement: create index idx_partitioned_defer on `bucket_%2`.`_default`.`_default`(balance) partition by hash(meta().id)  with {"defer_build":true}
2023/02/16 20:56:40 Index status is: Created for index: idx_partitioned_defer, bucket: bucket_%2, scope: _default, collection: _default
2023/02/16 20:56:40 Index status is: Created for index: idx_partitioned_defer (replica 1), bucket: bucket_%2, scope: _default, collection: _default
2023/02/16 20:56:40 Created collection succeeded for bucket: bucket_%2, scope: _default, collection: c1, body: {"uid":"2"}
2023/02/16 20:56:40 TestIndexPlacement: Manifest for bucket: bucket_%2, scope: _default, collection: c1 is: map[uid:2]
2023/02/16 20:56:40 WaitForCollectionCreation: Checking collection creation for host: 127.0.0.1:9000, bucket: bucket_%2, scope: _default, collection: c1
2023/02/16 20:56:41 WaitForCollectionCreation: Checking collection creation for host: 127.0.0.1:9001, bucket: bucket_%2, scope: _default, collection: c1
2023/02/16 20:56:41 WaitForCollectionCreation: Checking collection creation for host: 127.0.0.1:9002, bucket: bucket_%2, scope: _default, collection: c1
2023/02/16 20:56:41 Received OK response from ensureManifest, bucket: bucket_%2, uid: 2
2023/02/16 20:56:42 Executing N1ql statement: create index idx_secondary on `bucket_%2`.`_default`.`c1`(age)
2023/02/16 20:56:51 Index status is: Ready for index: idx_secondary, bucket: bucket_%2, scope: _default, collection: c1
2023/02/16 20:56:51 Index status is: Ready for index: idx_secondary (replica 1), bucket: bucket_%2, scope: _default, collection: c1
2023/02/16 20:56:51 Executing N1ql statement: create index idx_secondary_defer on `bucket_%2`.`_default`.`c1`(company) with {"defer_build":true}
2023/02/16 20:56:56 Index status is: Created for index: idx_secondary_defer, bucket: bucket_%2, scope: _default, collection: c1
2023/02/16 20:56:56 Index status is: Created for index: idx_secondary_defer (replica 1), bucket: bucket_%2, scope: _default, collection: c1
2023/02/16 20:56:56 Executing N1ql statement: create primary index `#primary` on `bucket_%2`.`_default`.`c1`
2023/02/16 20:57:05 Index status is: Ready for index: #primary, bucket: bucket_%2, scope: _default, collection: c1
2023/02/16 20:57:05 Index status is: Ready for index: #primary (replica 1), bucket: bucket_%2, scope: _default, collection: c1
2023/02/16 20:57:05 Executing N1ql statement: create primary index `#primary_defer` on `bucket_%2`.`_default`.`c1` with {"defer_build":true}
2023/02/16 20:57:11 Index status is: Created for index: #primary_defer, bucket: bucket_%2, scope: _default, collection: c1
2023/02/16 20:57:11 Index status is: Created for index: #primary_defer (replica 1), bucket: bucket_%2, scope: _default, collection: c1
2023/02/16 20:57:11 Executing N1ql statement: create index idx_partitioned on `bucket_%2`.`_default`.`c1`(emailid) partition by hash(meta().id)
2023/02/16 20:57:15 Index status is: Ready for index: idx_partitioned, bucket: bucket_%2, scope: _default, collection: c1
2023/02/16 20:57:16 Index status is: Ready for index: idx_partitioned (replica 1), bucket: bucket_%2, scope: _default, collection: c1
2023/02/16 20:57:16 Executing N1ql statement: create index idx_partitioned_defer on `bucket_%2`.`_default`.`c1`(balance) partition by hash(meta().id)  with {"defer_build":true}
2023/02/16 20:57:21 Index status is: Created for index: idx_partitioned_defer, bucket: bucket_%2, scope: _default, collection: c1
2023/02/16 20:57:21 Index status is: Created for index: idx_partitioned_defer (replica 1), bucket: bucket_%2, scope: _default, collection: c1
2023/02/16 20:57:21 Created collection succeeded for bucket: bucket_%2, scope: _default, collection: c2%, body: {"uid":"3"}
2023/02/16 20:57:21 TestIndexPlacement: Manifest for bucket: bucket_%2, scope: _default, collection: c2% is: map[uid:3]
2023/02/16 20:57:21 WaitForCollectionCreation: Checking collection creation for host: 127.0.0.1:9000, bucket: bucket_%2, scope: _default, collection: c2%
2023/02/16 20:57:21 WaitForCollectionCreation: Checking collection creation for host: 127.0.0.1:9001, bucket: bucket_%2, scope: _default, collection: c2%
2023/02/16 20:57:22 WaitForCollectionCreation: Checking collection creation for host: 127.0.0.1:9002, bucket: bucket_%2, scope: _default, collection: c2%
2023/02/16 20:57:22 Received OK response from ensureManifest, bucket: bucket_%2, uid: 3
2023/02/16 20:57:23 Executing N1ql statement: create index idx_secondary on `bucket_%2`.`_default`.`c2%`(age)
2023/02/16 20:57:31 Index status is: Ready for index: idx_secondary, bucket: bucket_%2, scope: _default, collection: c2%
2023/02/16 20:57:31 Index status is: Ready for index: idx_secondary (replica 1), bucket: bucket_%2, scope: _default, collection: c2%
2023/02/16 20:57:31 Executing N1ql statement: create index idx_secondary_defer on `bucket_%2`.`_default`.`c2%`(company) with {"defer_build":true}
2023/02/16 20:57:36 Index status is: Created for index: idx_secondary_defer, bucket: bucket_%2, scope: _default, collection: c2%
2023/02/16 20:57:36 Index status is: Created for index: idx_secondary_defer (replica 1), bucket: bucket_%2, scope: _default, collection: c2%
2023/02/16 20:57:36 Executing N1ql statement: create primary index `#primary` on `bucket_%2`.`_default`.`c2%`
2023/02/16 20:57:46 Index status is: Ready for index: #primary, bucket: bucket_%2, scope: _default, collection: c2%
2023/02/16 20:57:46 Index status is: Ready for index: #primary (replica 1), bucket: bucket_%2, scope: _default, collection: c2%
2023/02/16 20:57:46 Executing N1ql statement: create primary index `#primary_defer` on `bucket_%2`.`_default`.`c2%` with {"defer_build":true}
2023/02/16 20:57:51 Index status is: Created for index: #primary_defer, bucket: bucket_%2, scope: _default, collection: c2%
2023/02/16 20:57:51 Index status is: Created for index: #primary_defer (replica 1), bucket: bucket_%2, scope: _default, collection: c2%
2023/02/16 20:57:51 Executing N1ql statement: create index idx_partitioned on `bucket_%2`.`_default`.`c2%`(emailid) partition by hash(meta().id)
2023/02/16 20:57:56 Index status is: Ready for index: idx_partitioned, bucket: bucket_%2, scope: _default, collection: c2%
2023/02/16 20:57:56 Index status is: Ready for index: idx_partitioned (replica 1), bucket: bucket_%2, scope: _default, collection: c2%
2023/02/16 20:57:56 Executing N1ql statement: create index idx_partitioned_defer on `bucket_%2`.`_default`.`c2%`(balance) partition by hash(meta().id)  with {"defer_build":true}
2023/02/16 20:58:00 Index status is: Created for index: idx_partitioned_defer, bucket: bucket_%2, scope: _default, collection: c2%
2023/02/16 20:58:00 Index status is: Created for index: idx_partitioned_defer (replica 1), bucket: bucket_%2, scope: _default, collection: c2%
2023/02/16 20:58:03 scanIndexReplicas: Scanning all for index: idx_secondary, bucket: bucket_1, scope: _default, collection: _default
2023-02-16T20:58:03.707+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T20:58:03.709+05:30 [Info] GSIC[default/bucket_1-_default-_default-1676561283705628777] started ...
2023/02/16 20:58:04 scanIndexReplicas: Scanning all for index: #primary, bucket: bucket_1, scope: _default, collection: _default
2023/02/16 20:58:06 scanIndexReplicas: Scanning all for index: idx_partitioned, bucket: bucket_1, scope: _default, collection: _default
2023/02/16 20:58:08 scanIndexReplicas: Scanning all for index: idx_secondary, bucket: bucket_1, scope: _default, collection: c1
2023/02/16 20:58:09 scanIndexReplicas: Scanning all for index: #primary, bucket: bucket_1, scope: _default, collection: c1
2023/02/16 20:58:11 scanIndexReplicas: Scanning all for index: idx_partitioned, bucket: bucket_1, scope: _default, collection: c1
2023/02/16 20:58:12 scanIndexReplicas: Scanning all for index: idx_secondary, bucket: bucket_1, scope: _default, collection: c2%
2023-02-16T20:58:12.814+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T20:58:12.815+05:30 [Info] GSIC[default/bucket_1-_default-c2%-1676561292811877631] started ...
2023/02/16 20:58:14 scanIndexReplicas: Scanning all for index: #primary, bucket: bucket_1, scope: _default, collection: c2%
2023/02/16 20:58:15 scanIndexReplicas: Scanning all for index: idx_partitioned, bucket: bucket_1, scope: _default, collection: c2%
2023/02/16 20:58:17 scanIndexReplicas: Scanning all for index: idx_secondary, bucket: bucket_%2, scope: _default, collection: _default
2023-02-16T20:58:17.295+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T20:58:17.296+05:30 [Info] GSIC[default/bucket_%2-_default-_default-1676561297292830135] started ...
2023/02/16 20:58:18 scanIndexReplicas: Scanning all for index: #primary, bucket: bucket_%2, scope: _default, collection: _default
2023/02/16 20:58:20 scanIndexReplicas: Scanning all for index: idx_partitioned, bucket: bucket_%2, scope: _default, collection: _default
2023/02/16 20:58:21 scanIndexReplicas: Scanning all for index: idx_secondary, bucket: bucket_%2, scope: _default, collection: c1
2023-02-16T20:58:21.559+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T20:58:21.560+05:30 [Info] GSIC[default/bucket_%2-_default-c1-1676561301557007276] started ...
2023/02/16 20:58:22 scanIndexReplicas: Scanning all for index: #primary, bucket: bucket_%2, scope: _default, collection: c1
2023/02/16 20:58:24 scanIndexReplicas: Scanning all for index: idx_partitioned, bucket: bucket_%2, scope: _default, collection: c1
2023/02/16 20:58:26 scanIndexReplicas: Scanning all for index: idx_secondary, bucket: bucket_%2, scope: _default, collection: c2%
2023-02-16T20:58:26.014+05:30 [Info] GsiClient::UpdateUsecjson: using collatejson as data format between indexer and GsiClient
2023-02-16T20:58:26.015+05:30 [Info] GSIC[default/bucket_%2-_default-c2%-1676561306011965841] started ...
2023/02/16 20:58:27 scanIndexReplicas: Scanning all for index: #primary, bucket: bucket_%2, scope: _default, collection: c2%
2023/02/16 20:58:28 scanIndexReplicas: Scanning all for index: idx_partitioned, bucket: bucket_%2, scope: _default, collection: c2%
--- PASS: TestShardIdMapping (279.77s)
=== RUN   TestShardRebalanceSetup
2023/02/16 20:58:30 In TestShardRebalanceSetup
2023/02/16 20:58:30 TestShardRebalanceSetup: Using  as storage dir for rebalance
2023/02/16 20:58:30 Changing config key indexer.settings.rebalance.blob_storage_bucket to value /opt/build/goproj/src/github.com/couchbase/indexing/secondary/tests/serverlesstests
2023/02/16 20:58:30 Changing config key indexer.settings.rebalance.blob_storage_prefix to value shard_rebalance_storage_dir
--- PASS: TestShardRebalanceSetup (0.23s)
=== RUN   TestTwoNodeSwapRebalance
2023/02/16 20:58:30 In TestTwoNodeSwapRebalance
2023/02/16 20:58:30 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 20:58:38 AddNodeWithServerGroup: Successfully added node: 127.0.0.1:9003 (role index, serverGroup: Group 2), response: {"otpNode":"n_3@127.0.0.1"}
2023/02/16 20:58:38 Adding node: https://127.0.0.1:19004 with role: index to the cluster with uri: /pools/default/serverGroups/0/addNode
2023/02/16 20:58:46 AddNodeWithServerGroup: Successfully added node: 127.0.0.1:9004 (role index, serverGroup: Group 1), response: {"otpNode":"n_4@127.0.0.1"}
2023/02/16 20:58:46 Removing node(s): [127.0.0.1:9001 127.0.0.1:9002] from the cluster
2023/02/16 20:58:52 Rebalance progress: 10
2023/02/16 20:58:57 Rebalance progress: 24
2023/02/16 20:59:02 Rebalance progress: 24
2023/02/16 20:59:07 Rebalance progress: 24
2023/02/16 20:59:12 Rebalance progress: 24
2023/02/16 20:59:17 Rebalance progress: 24
2023/02/16 20:59:22 Rebalance progress: 24
2023/02/16 20:59:27 Rebalance progress: 24
2023/02/16 20:59:32 Rebalance progress: 24
2023/02/16 20:59:37 Rebalance progress: 24
2023/02/16 20:59:42 Rebalance progress: 24
2023/02/16 20:59:47 Rebalance progress: 24
2023/02/16 20:59:52 Rebalance progress: 24
2023/02/16 20:59:57 Rebalance progress: 24
2023/02/16 21:00:02 Rebalance progress: 24
2023-02-16T21:00:04.946+05:30 [Error] receiving packet: read tcp 127.0.0.1:54656->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:04.946+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(1977532795809096137) connection "127.0.0.1:54656" response transport failed `read tcp 127.0.0.1:54656->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:04.947+05:30 [Error] receiving packet: read tcp 127.0.0.1:40760->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:04.948+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(1977532795809096137) connection "127.0.0.1:40760" response transport failed `read tcp 127.0.0.1:40760->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:04.950+05:30 [Error] receiving packet: read tcp 127.0.0.1:47308->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:04.950+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(1977532795809096137) connection "127.0.0.1:47308" response transport failed `read tcp 127.0.0.1:47308->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:04.971+05:30 [Error] receiving packet: read tcp 127.0.0.1:47312->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:04.971+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(4070029310123620438) connection "127.0.0.1:47312" response transport failed `read tcp 127.0.0.1:47312->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:04.989+05:30 [Error] receiving packet: read tcp 127.0.0.1:47314->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:04.989+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(5642718202147260415) connection "127.0.0.1:47314" response transport failed `read tcp 127.0.0.1:47314->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:04.990+05:30 [Error] receiving packet: read tcp 127.0.0.1:32976->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:04.990+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(5642718202147260415) connection "127.0.0.1:32976" response transport failed `read tcp 127.0.0.1:32976->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.000+05:30 [Error] receiving packet: read tcp 127.0.0.1:47318->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.000+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(7083575613122006559) connection "127.0.0.1:47318" response transport failed `read tcp 127.0.0.1:47318->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.017+05:30 [Error] receiving packet: read tcp 127.0.0.1:47320->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.017+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-8668536991178297019) connection "127.0.0.1:47320" response transport failed `read tcp 127.0.0.1:47320->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.043+05:30 [Error] receiving packet: read tcp 127.0.0.1:47322->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.043+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-3281652505618327067) connection "127.0.0.1:47322" response transport failed `read tcp 127.0.0.1:47322->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.083+05:30 [Error] receiving packet: read tcp 127.0.0.1:47324->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.083+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(4455889213846078836) connection "127.0.0.1:47324" response transport failed `read tcp 127.0.0.1:47324->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.083+05:30 [Error] receiving packet: read tcp 127.0.0.1:32982->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.083+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(4455889213846078836) connection "127.0.0.1:32982" response transport failed `read tcp 127.0.0.1:32982->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.099+05:30 [Error] receiving packet: read tcp 127.0.0.1:32994->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.099+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(1710578241048116923) connection "127.0.0.1:32994" response transport failed `read tcp 127.0.0.1:32994->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.099+05:30 [Error] receiving packet: read tcp 127.0.0.1:47326->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.099+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(1710578241048116923) connection "127.0.0.1:47326" response transport failed `read tcp 127.0.0.1:47326->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.107+05:30 [Error] receiving packet: read tcp 127.0.0.1:32996->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.108+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(486764191574800356) connection "127.0.0.1:32996" response transport failed `read tcp 127.0.0.1:32996->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.134+05:30 [Error] receiving packet: read tcp 127.0.0.1:33000->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.134+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(5812402124514074945) connection "127.0.0.1:33000" response transport failed `read tcp 127.0.0.1:33000->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.143+05:30 [Error] receiving packet: read tcp 127.0.0.1:33002->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.143+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-8218861433253775294) connection "127.0.0.1:33002" response transport failed `read tcp 127.0.0.1:33002->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.143+05:30 [Error] receiving packet: read tcp 127.0.0.1:47332->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.143+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-8218861433253775294) connection "127.0.0.1:47332" response transport failed `read tcp 127.0.0.1:47332->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.151+05:30 [Error] receiving packet: read tcp 127.0.0.1:33004->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.151+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(5281458261364072213) connection "127.0.0.1:33004" response transport failed `read tcp 127.0.0.1:33004->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.154+05:30 [Error] receiving packet: read tcp 127.0.0.1:47340->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.154+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(5281458261364072213) connection "127.0.0.1:47340" response transport failed `read tcp 127.0.0.1:47340->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.170+05:30 [Error] receiving packet: read tcp 127.0.0.1:47342->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.170+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(5396218898768484053) connection "127.0.0.1:47342" response transport failed `read tcp 127.0.0.1:47342->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.172+05:30 [Error] receiving packet: read tcp 127.0.0.1:47346->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.173+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(5396218898768484053) connection "127.0.0.1:47346" response transport failed `read tcp 127.0.0.1:47346->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.176+05:30 [Error] receiving packet: read tcp 127.0.0.1:47348->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.176+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(5396218898768484053) connection "127.0.0.1:47348" response transport failed `read tcp 127.0.0.1:47348->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.177+05:30 [Error] receiving packet: read tcp 127.0.0.1:33010->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.177+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(5396218898768484053) connection "127.0.0.1:33010" response transport failed `read tcp 127.0.0.1:33010->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.179+05:30 [Error] receiving packet: read tcp 127.0.0.1:47350->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.179+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(5396218898768484053) connection "127.0.0.1:47350" response transport failed `read tcp 127.0.0.1:47350->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.209+05:30 [Error] receiving packet: read tcp 127.0.0.1:47354->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.209+05:30 [Error] receiving packet: read tcp 127.0.0.1:33018->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.209+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(3315625522110025479) connection "127.0.0.1:47354" response transport failed `read tcp 127.0.0.1:47354->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.209+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(3315625522110025479) connection "127.0.0.1:33018" response transport failed `read tcp 127.0.0.1:33018->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.212+05:30 [Error] receiving packet: read tcp 127.0.0.1:47356->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.212+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(3315625522110025479) connection "127.0.0.1:47356" response transport failed `read tcp 127.0.0.1:47356->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.220+05:30 [Error] receiving packet: read tcp 127.0.0.1:33024->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.220+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(3627617969263614507) connection "127.0.0.1:33024" response transport failed `read tcp 127.0.0.1:33024->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.227+05:30 [Error] receiving packet: read tcp 127.0.0.1:47360->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.227+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-4943798725887502380) connection "127.0.0.1:47360" response transport failed `read tcp 127.0.0.1:47360->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.243+05:30 [Error] receiving packet: read tcp 127.0.0.1:33028->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.243+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-9117379083230243044) connection "127.0.0.1:33028" response transport failed `read tcp 127.0.0.1:33028->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.256+05:30 [Error] receiving packet: read tcp 127.0.0.1:47364->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.256+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(7913032098424234211) connection "127.0.0.1:47364" response transport failed `read tcp 127.0.0.1:47364->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.264+05:30 [Error] receiving packet: read tcp 127.0.0.1:47368->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.264+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(1913012880644403825) connection "127.0.0.1:47368" response transport failed `read tcp 127.0.0.1:47368->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.267+05:30 [Error] receiving packet: read tcp 127.0.0.1:47370->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.267+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(1913012880644403825) connection "127.0.0.1:47370" response transport failed `read tcp 127.0.0.1:47370->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.269+05:30 [Error] receiving packet: read tcp 127.0.0.1:47372->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.269+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(1913012880644403825) connection "127.0.0.1:47372" response transport failed `read tcp 127.0.0.1:47372->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.270+05:30 [Error] receiving packet: read tcp 127.0.0.1:33032->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.270+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(1913012880644403825) connection "127.0.0.1:33032" response transport failed `read tcp 127.0.0.1:33032->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.273+05:30 [Error] receiving packet: read tcp 127.0.0.1:33040->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.274+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(1913012880644403825) connection "127.0.0.1:33040" response transport failed `read tcp 127.0.0.1:33040->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.287+05:30 [Error] receiving packet: read tcp 127.0.0.1:33042->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.287+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(6345127630024896298) connection "127.0.0.1:33042" response transport failed `read tcp 127.0.0.1:33042->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.298+05:30 [Error] receiving packet: read tcp 127.0.0.1:33046->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.298+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(5174115500018247687) connection "127.0.0.1:33046" response transport failed `read tcp 127.0.0.1:33046->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.306+05:30 [Error] receiving packet: read tcp 127.0.0.1:47378->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.307+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(907820125259131774) connection "127.0.0.1:47378" response transport failed `read tcp 127.0.0.1:47378->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.308+05:30 [Error] receiving packet: read tcp 127.0.0.1:33048->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.308+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(907820125259131774) connection "127.0.0.1:33048" response transport failed `read tcp 127.0.0.1:33048->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.311+05:30 [Error] receiving packet: read tcp 127.0.0.1:47384->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.311+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(907820125259131774) connection "127.0.0.1:47384" response transport failed `read tcp 127.0.0.1:47384->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.313+05:30 [Error] receiving packet: read tcp 127.0.0.1:33052->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.313+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(907820125259131774) connection "127.0.0.1:33052" response transport failed `read tcp 127.0.0.1:33052->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.357+05:30 [Error] receiving packet: read tcp 127.0.0.1:47388->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.357+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(2382161974486057278) connection "127.0.0.1:47388" response transport failed `read tcp 127.0.0.1:47388->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.358+05:30 [Error] receiving packet: read tcp 127.0.0.1:33056->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.358+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(2382161974486057278) connection "127.0.0.1:33056" response transport failed `read tcp 127.0.0.1:33056->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.375+05:30 [Error] receiving packet: read tcp 127.0.0.1:33060->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.375+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-7311034950388917650) connection "127.0.0.1:33060" response transport failed `read tcp 127.0.0.1:33060->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.380+05:30 [Error] receiving packet: read tcp 127.0.0.1:47392->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.380+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-767301775105406648) connection "127.0.0.1:47392" response transport failed `read tcp 127.0.0.1:47392->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.382+05:30 [Error] receiving packet: read tcp 127.0.0.1:47396->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.383+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-767301775105406648) connection "127.0.0.1:47396" response transport failed `read tcp 127.0.0.1:47396->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.386+05:30 [Error] receiving packet: read tcp 127.0.0.1:47398->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.386+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-767301775105406648) connection "127.0.0.1:47398" response transport failed `read tcp 127.0.0.1:47398->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.388+05:30 [Error] receiving packet: read tcp 127.0.0.1:47400->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.388+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-767301775105406648) connection "127.0.0.1:47400" response transport failed `read tcp 127.0.0.1:47400->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.392+05:30 [Error] receiving packet: read tcp 127.0.0.1:33068->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.392+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-767301775105406648) connection "127.0.0.1:33068" response transport failed `read tcp 127.0.0.1:33068->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.396+05:30 [Error] receiving packet: read tcp 127.0.0.1:47404->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.396+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-767301775105406648) connection "127.0.0.1:47404" response transport failed `read tcp 127.0.0.1:47404->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.398+05:30 [Error] receiving packet: read tcp 127.0.0.1:33072->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.398+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-767301775105406648) connection "127.0.0.1:33072" response transport failed `read tcp 127.0.0.1:33072->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.401+05:30 [Error] receiving packet: read tcp 127.0.0.1:33074->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.401+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-767301775105406648) connection "127.0.0.1:33074" response transport failed `read tcp 127.0.0.1:33074->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.414+05:30 [Error] receiving packet: read tcp 127.0.0.1:33078->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.414+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(7253873489792847029) connection "127.0.0.1:33078" response transport failed `read tcp 127.0.0.1:33078->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.421+05:30 [Error] receiving packet: read tcp 127.0.0.1:47410->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.421+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-1519400443684560239) connection "127.0.0.1:47410" response transport failed `read tcp 127.0.0.1:47410->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.453+05:30 [Error] receiving packet: read tcp 127.0.0.1:33080->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.453+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-5783001096139484645) connection "127.0.0.1:33080" response transport failed `read tcp 127.0.0.1:33080->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.455+05:30 [Error] receiving packet: read tcp 127.0.0.1:47416->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.455+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-5783001096139484645) connection "127.0.0.1:47416" response transport failed `read tcp 127.0.0.1:47416->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.503+05:30 [Error] receiving packet: read tcp 127.0.0.1:33090->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.503+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(7097276007262087049) connection "127.0.0.1:33090" response transport failed `read tcp 127.0.0.1:33090->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.512+05:30 [Error] receiving packet: read tcp 127.0.0.1:33096->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.512+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(7097276007262087049) connection "127.0.0.1:33096" response transport failed `read tcp 127.0.0.1:33096->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.565+05:30 [Error] receiving packet: read tcp 127.0.0.1:47426->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.565+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(554153396833545538) connection "127.0.0.1:47426" response transport failed `read tcp 127.0.0.1:47426->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.575+05:30 [Error] receiving packet: read tcp 127.0.0.1:33104->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.575+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(554153396833545538) connection "127.0.0.1:33104" response transport failed `read tcp 127.0.0.1:33104->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.579+05:30 [Error] receiving packet: read tcp 127.0.0.1:33106->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.579+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(554153396833545538) connection "127.0.0.1:33106" response transport failed `read tcp 127.0.0.1:33106->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.597+05:30 [Error] receiving packet: read tcp 127.0.0.1:33108->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.597+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-4725141982270276429) connection "127.0.0.1:33108" response transport failed `read tcp 127.0.0.1:33108->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.601+05:30 [Error] receiving packet: read tcp 127.0.0.1:33110->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.601+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-4725141982270276429) connection "127.0.0.1:33110" response transport failed `read tcp 127.0.0.1:33110->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.613+05:30 [Error] receiving packet: read tcp 127.0.0.1:33112->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.613+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-3666310303685608043) connection "127.0.0.1:33112" response transport failed `read tcp 127.0.0.1:33112->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.615+05:30 [Error] receiving packet: read tcp 127.0.0.1:33114->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.615+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-3666310303685608043) connection "127.0.0.1:33114" response transport failed `read tcp 127.0.0.1:33114->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.623+05:30 [Error] receiving packet: read tcp 127.0.0.1:33116->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.623+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-3170526277499840622) connection "127.0.0.1:33116" response transport failed `read tcp 127.0.0.1:33116->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.651+05:30 [Error] receiving packet: read tcp 127.0.0.1:33118->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.651+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-50068080867962454) connection "127.0.0.1:33118" response transport failed `read tcp 127.0.0.1:33118->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.659+05:30 [Error] receiving packet: read tcp 127.0.0.1:33122->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.659+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(5180687126896997618) connection "127.0.0.1:33122" response transport failed `read tcp 127.0.0.1:33122->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.666+05:30 [Error] receiving packet: read tcp 127.0.0.1:33124->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.667+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(7254442274749042267) connection "127.0.0.1:33124" response transport failed `read tcp 127.0.0.1:33124->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.668+05:30 [Error] receiving packet: read tcp 127.0.0.1:47454->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.668+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(7254442274749042267) connection "127.0.0.1:47454" response transport failed `read tcp 127.0.0.1:47454->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.684+05:30 [Error] receiving packet: read tcp 127.0.0.1:33128->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.684+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(8733206324866324291) connection "127.0.0.1:33128" response transport failed `read tcp 127.0.0.1:33128->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.687+05:30 [Error] receiving packet: read tcp 127.0.0.1:33130->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.688+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(8733206324866324291) connection "127.0.0.1:33130" response transport failed `read tcp 127.0.0.1:33130->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.720+05:30 [Error] receiving packet: read tcp 127.0.0.1:33132->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.720+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(7020612030689360312) connection "127.0.0.1:33132" response transport failed `read tcp 127.0.0.1:33132->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.722+05:30 [Error] receiving packet: read tcp 127.0.0.1:33134->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.722+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(7020612030689360312) connection "127.0.0.1:33134" response transport failed `read tcp 127.0.0.1:33134->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.735+05:30 [Error] receiving packet: read tcp 127.0.0.1:33136->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.735+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-4825671389913777027) connection "127.0.0.1:33136" response transport failed `read tcp 127.0.0.1:33136->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.748+05:30 [Error] receiving packet: read tcp 127.0.0.1:33138->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.748+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-8909770242310465489) connection "127.0.0.1:33138" response transport failed `read tcp 127.0.0.1:33138->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.748+05:30 [Error] receiving packet: read tcp 127.0.0.1:47460->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.748+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-8909770242310465489) connection "127.0.0.1:47460" response transport failed `read tcp 127.0.0.1:47460->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.751+05:30 [Error] receiving packet: read tcp 127.0.0.1:47474->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.751+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-8909770242310465489) connection "127.0.0.1:47474" response transport failed `read tcp 127.0.0.1:47474->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.761+05:30 [Error] receiving packet: read tcp 127.0.0.1:33144->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.761+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(7174078406797639944) connection "127.0.0.1:33144" response transport failed `read tcp 127.0.0.1:33144->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.773+05:30 [Error] receiving packet: read tcp 127.0.0.1:33146->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.773+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(2567016369233916774) connection "127.0.0.1:33146" response transport failed `read tcp 127.0.0.1:33146->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.777+05:30 [Error] receiving packet: read tcp 127.0.0.1:33148->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.777+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(2567016369233916774) connection "127.0.0.1:33148" response transport failed `read tcp 127.0.0.1:33148->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.790+05:30 [Error] receiving packet: read tcp 127.0.0.1:33150->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.790+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-978131934852853819) connection "127.0.0.1:33150" response transport failed `read tcp 127.0.0.1:33150->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.792+05:30 [Error] receiving packet: read tcp 127.0.0.1:33152->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.792+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-978131934852853819) connection "127.0.0.1:33152" response transport failed `read tcp 127.0.0.1:33152->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.795+05:30 [Error] receiving packet: read tcp 127.0.0.1:33154->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.795+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-978131934852853819) connection "127.0.0.1:33154" response transport failed `read tcp 127.0.0.1:33154->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.827+05:30 [Error] receiving packet: read tcp 127.0.0.1:33156->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.827+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(240474231699609919) connection "127.0.0.1:33156" response transport failed `read tcp 127.0.0.1:33156->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.829+05:30 [Error] receiving packet: read tcp 127.0.0.1:33158->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.829+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(240474231699609919) connection "127.0.0.1:33158" response transport failed `read tcp 127.0.0.1:33158->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.837+05:30 [Error] receiving packet: read tcp 127.0.0.1:33160->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.837+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(3923361482960462889) connection "127.0.0.1:33160" response transport failed `read tcp 127.0.0.1:33160->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.841+05:30 [Error] receiving packet: read tcp 127.0.0.1:47476->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.841+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(3923361482960462889) connection "127.0.0.1:47476" response transport failed `read tcp 127.0.0.1:47476->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.841+05:30 [Error] receiving packet: read tcp 127.0.0.1:33162->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.841+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(3923361482960462889) connection "127.0.0.1:33162" response transport failed `read tcp 127.0.0.1:33162->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.872+05:30 [Error] receiving packet: read tcp 127.0.0.1:33164->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.872+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-6327963562530139646) connection "127.0.0.1:33164" response transport failed `read tcp 127.0.0.1:33164->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.873+05:30 [Error] receiving packet: read tcp 127.0.0.1:47500->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.873+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-6327963562530139646) connection "127.0.0.1:47500" response transport failed `read tcp 127.0.0.1:47500->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.889+05:30 [Error] receiving packet: read tcp 127.0.0.1:47504->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.889+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-5191600137006182748) connection "127.0.0.1:47504" response transport failed `read tcp 127.0.0.1:47504->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.889+05:30 [Error] receiving packet: read tcp 127.0.0.1:33168->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.889+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-5191600137006182748) connection "127.0.0.1:33168" response transport failed `read tcp 127.0.0.1:33168->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.893+05:30 [Error] receiving packet: read tcp 127.0.0.1:33172->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.893+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-5191600137006182748) connection "127.0.0.1:33172" response transport failed `read tcp 127.0.0.1:33172->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.902+05:30 [Error] receiving packet: read tcp 127.0.0.1:33174->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.902+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(1826312486108031384) connection "127.0.0.1:33174" response transport failed `read tcp 127.0.0.1:33174->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.933+05:30 [Error] receiving packet: read tcp 127.0.0.1:47510->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.933+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(2998170080546423390) connection "127.0.0.1:47510" response transport failed `read tcp 127.0.0.1:47510->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.941+05:30 [Error] receiving packet: read tcp 127.0.0.1:33178->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.941+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-2544192019660732592) connection "127.0.0.1:33178" response transport failed `read tcp 127.0.0.1:33178->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.944+05:30 [Error] receiving packet: read tcp 127.0.0.1:33182->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:05.944+05:30 [Error] receiving packet: read tcp 127.0.0.1:47514->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.944+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-2544192019660732592) connection "127.0.0.1:47514" response transport failed `read tcp 127.0.0.1:47514->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.944+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-2544192019660732592) connection "127.0.0.1:33182" response transport failed `read tcp 127.0.0.1:33182->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:05.954+05:30 [Error] receiving packet: read tcp 127.0.0.1:47518->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.954+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(8939363947328921663) connection "127.0.0.1:47518" response transport failed `read tcp 127.0.0.1:47518->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:05.956+05:30 [Error] receiving packet: read tcp 127.0.0.1:47520->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:05.956+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(8939363947328921663) connection "127.0.0.1:47520" response transport failed `read tcp 127.0.0.1:47520->127.0.0.1:9113: i/o timeout`
2023/02/16 21:00:07 Rebalance progress: 24
2023-02-16T21:00:09.545+05:30 [Error] receiving packet: read tcp 127.0.0.1:47522->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:09.545+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-7747306782386587643) connection "127.0.0.1:47522" response transport failed `read tcp 127.0.0.1:47522->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:09.553+05:30 [Error] receiving packet: read tcp 127.0.0.1:47626->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:09.553+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-3357416505816390053) connection "127.0.0.1:47626" response transport failed `read tcp 127.0.0.1:47626->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:09.555+05:30 [Error] receiving packet: read tcp 127.0.0.1:47628->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:09.555+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-3357416505816390053) connection "127.0.0.1:47628" response transport failed `read tcp 127.0.0.1:47628->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:09.564+05:30 [Error] receiving packet: read tcp 127.0.0.1:47630->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:09.564+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-9187451584812854773) connection "127.0.0.1:47630" response transport failed `read tcp 127.0.0.1:47630->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:09.579+05:30 [Error] receiving packet: read tcp 127.0.0.1:33216->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:09.579+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(1376711308298703493) connection "127.0.0.1:33216" response transport failed `read tcp 127.0.0.1:33216->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:09.599+05:30 [Error] receiving packet: read tcp 127.0.0.1:33300->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:09.600+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(4248831774137866735) connection "127.0.0.1:33300" response transport failed `read tcp 127.0.0.1:33300->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:09.601+05:30 [Error] receiving packet: read tcp 127.0.0.1:33302->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:09.601+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(4248831774137866735) connection "127.0.0.1:33302" response transport failed `read tcp 127.0.0.1:33302->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:09.611+05:30 [Error] receiving packet: read tcp 127.0.0.1:47632->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:09.611+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-8777113153645373494) connection "127.0.0.1:47632" response transport failed `read tcp 127.0.0.1:47632->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:09.620+05:30 [Error] receiving packet: read tcp 127.0.0.1:33304->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:09.620+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(4535928498738946665) connection "127.0.0.1:33304" response transport failed `read tcp 127.0.0.1:33304->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:09.628+05:30 [Error] receiving packet: read tcp 127.0.0.1:33308->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:09.628+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-1126485917531918392) connection "127.0.0.1:33308" response transport failed `read tcp 127.0.0.1:33308->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:09.629+05:30 [Error] receiving packet: read tcp 127.0.0.1:47640->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:09.629+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-1126485917531918392) connection "127.0.0.1:47640" response transport failed `read tcp 127.0.0.1:47640->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:09.633+05:30 [Error] receiving packet: read tcp 127.0.0.1:47644->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:09.633+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-1126485917531918392) connection "127.0.0.1:47644" response transport failed `read tcp 127.0.0.1:47644->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:09.634+05:30 [Error] receiving packet: read tcp 127.0.0.1:47646->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:09.634+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-1126485917531918392) connection "127.0.0.1:47646" response transport failed `read tcp 127.0.0.1:47646->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:09.637+05:30 [Error] receiving packet: read tcp 127.0.0.1:33314->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:09.637+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-1126485917531918392) connection "127.0.0.1:33314" response transport failed `read tcp 127.0.0.1:33314->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:09.669+05:30 [Error] receiving packet: read tcp 127.0.0.1:47650->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:09.669+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(7260140046098318317) connection "127.0.0.1:47650" response transport failed `read tcp 127.0.0.1:47650->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:09.678+05:30 [Error] receiving packet: read tcp 127.0.0.1:47652->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:09.678+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-2510155283100423610) connection "127.0.0.1:47652" response transport failed `read tcp 127.0.0.1:47652->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:09.702+05:30 [Error] receiving packet: read tcp 127.0.0.1:33320->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:09.702+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(2908042272167565310) connection "127.0.0.1:33320" response transport failed `read tcp 127.0.0.1:33320->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:09.719+05:30 [Error] receiving packet: read tcp 127.0.0.1:33324->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:09.719+05:30 [Error] receiving packet: read tcp 127.0.0.1:47656->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:09.719+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-6607100116434581290) connection "127.0.0.1:47656" response transport failed `read tcp 127.0.0.1:47656->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:09.719+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-6607100116434581290) connection "127.0.0.1:33324" response transport failed `read tcp 127.0.0.1:33324->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:09.732+05:30 [Error] receiving packet: read tcp 127.0.0.1:47662->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:09.732+05:30 [Error] receiving packet: read tcp 127.0.0.1:33330->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:09.732+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-5393255870297179787) connection "127.0.0.1:33330" response transport failed `read tcp 127.0.0.1:33330->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:09.732+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-5393255870297179787) connection "127.0.0.1:47662" response transport failed `read tcp 127.0.0.1:47662->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:09.736+05:30 [Error] receiving packet: read tcp 127.0.0.1:33332->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:09.736+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-5393255870297179787) connection "127.0.0.1:33332" response transport failed `read tcp 127.0.0.1:33332->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:09.739+05:30 [Error] receiving packet: read tcp 127.0.0.1:47668->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:09.739+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-5393255870297179787) connection "127.0.0.1:47668" response transport failed `read tcp 127.0.0.1:47668->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:09.769+05:30 [Error] receiving packet: read tcp 127.0.0.1:47670->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:09.769+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(51231257183640341) connection "127.0.0.1:47670" response transport failed `read tcp 127.0.0.1:47670->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:09.770+05:30 [Error] receiving packet: read tcp 127.0.0.1:33338->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:09.770+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(51231257183640341) connection "127.0.0.1:33338" response transport failed `read tcp 127.0.0.1:33338->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:09.785+05:30 [Error] receiving packet: read tcp 127.0.0.1:33342->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:09.785+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(8102613973844993892) connection "127.0.0.1:33342" response transport failed `read tcp 127.0.0.1:33342->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:09.793+05:30 [Error] receiving packet: read tcp 127.0.0.1:47674->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:09.793+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(6970520749088935655) connection "127.0.0.1:47674" response transport failed `read tcp 127.0.0.1:47674->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:09.799+05:30 [Error] receiving packet: read tcp 127.0.0.1:33344->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:09.799+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-710201927483889740) connection "127.0.0.1:33344" response transport failed `read tcp 127.0.0.1:33344->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:09.807+05:30 [Error] receiving packet: read tcp 127.0.0.1:47680->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:09.808+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-1890519833336620001) connection "127.0.0.1:47680" response transport failed `read tcp 127.0.0.1:47680->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:09.811+05:30 [Error] receiving packet: read tcp 127.0.0.1:33348->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:09.811+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-1890519833336620001) connection "127.0.0.1:33348" response transport failed `read tcp 127.0.0.1:33348->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:09.813+05:30 [Error] receiving packet: read tcp 127.0.0.1:47684->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:09.813+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-1890519833336620001) connection "127.0.0.1:47684" response transport failed `read tcp 127.0.0.1:47684->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:09.824+05:30 [Error] receiving packet: read tcp 127.0.0.1:47686->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:09.824+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-6876694157710174505) connection "127.0.0.1:47686" response transport failed `read tcp 127.0.0.1:47686->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:09.827+05:30 [Error] receiving packet: read tcp 127.0.0.1:47688->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:09.827+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-6876694157710174505) connection "127.0.0.1:47688" response transport failed `read tcp 127.0.0.1:47688->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:09.836+05:30 [Error] receiving packet: read tcp 127.0.0.1:47690->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:09.836+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(3808032592454419211) connection "127.0.0.1:47690" response transport failed `read tcp 127.0.0.1:47690->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:09.890+05:30 [Error] receiving packet: read tcp 127.0.0.1:47692->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:09.891+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(3808032592454419211) connection "127.0.0.1:47692" response transport failed `read tcp 127.0.0.1:47692->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:09.934+05:30 [Error] receiving packet: read tcp 127.0.0.1:47696->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:09.934+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(1263865131206381493) connection "127.0.0.1:47696" response transport failed `read tcp 127.0.0.1:47696->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:09.946+05:30 [Error] receiving packet: read tcp 127.0.0.1:47698->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:09.946+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(8125385479048980666) connection "127.0.0.1:47698" response transport failed `read tcp 127.0.0.1:47698->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:09.946+05:30 [Error] receiving packet: read tcp 127.0.0.1:33360->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:09.946+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(8125385479048980666) connection "127.0.0.1:33360" response transport failed `read tcp 127.0.0.1:33360->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:09.965+05:30 [Error] receiving packet: read tcp 127.0.0.1:33366->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:09.965+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-858497725471092034) connection "127.0.0.1:33366" response transport failed `read tcp 127.0.0.1:33366->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:09.968+05:30 [Error] receiving packet: read tcp 127.0.0.1:33368->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:09.968+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-858497725471092034) connection "127.0.0.1:33368" response transport failed `read tcp 127.0.0.1:33368->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:09.980+05:30 [Error] receiving packet: read tcp 127.0.0.1:33370->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:09.980+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-858497725471092034) connection "127.0.0.1:33370" response transport failed `read tcp 127.0.0.1:33370->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:09.982+05:30 [Error] receiving packet: read tcp 127.0.0.1:33372->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:09.983+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-858497725471092034) connection "127.0.0.1:33372" response transport failed `read tcp 127.0.0.1:33372->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:09.997+05:30 [Error] receiving packet: read tcp 127.0.0.1:33374->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:09.997+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-1296643175640345780) connection "127.0.0.1:33374" response transport failed `read tcp 127.0.0.1:33374->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:10.025+05:30 [Error] receiving packet: read tcp 127.0.0.1:47712->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.025+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(386798862173252498) connection "127.0.0.1:47712" response transport failed `read tcp 127.0.0.1:47712->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.041+05:30 [Error] receiving packet: read tcp 127.0.0.1:47714->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.041+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(8173943030844788695) connection "127.0.0.1:47714" response transport failed `read tcp 127.0.0.1:47714->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.043+05:30 [Error] receiving packet: read tcp 127.0.0.1:33376->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:10.043+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(8173943030844788695) connection "127.0.0.1:33376" response transport failed `read tcp 127.0.0.1:33376->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:10.092+05:30 [Error] receiving packet: read tcp 127.0.0.1:47718->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.092+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-5219699560787322191) connection "127.0.0.1:47718" response transport failed `read tcp 127.0.0.1:47718->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.092+05:30 [Error] receiving packet: read tcp 127.0.0.1:33382->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:10.092+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-5219699560787322191) connection "127.0.0.1:33382" response transport failed `read tcp 127.0.0.1:33382->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:10.103+05:30 [Error] receiving packet: read tcp 127.0.0.1:47722->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.103+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-3640463043251321593) connection "127.0.0.1:47722" response transport failed `read tcp 127.0.0.1:47722->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.106+05:30 [Error] receiving packet: read tcp 127.0.0.1:47724->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.106+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-3640463043251321593) connection "127.0.0.1:47724" response transport failed `read tcp 127.0.0.1:47724->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.109+05:30 [Error] receiving packet: read tcp 127.0.0.1:47726->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.109+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-3640463043251321593) connection "127.0.0.1:47726" response transport failed `read tcp 127.0.0.1:47726->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.109+05:30 [Error] receiving packet: read tcp 127.0.0.1:33386->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:10.109+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-3640463043251321593) connection "127.0.0.1:33386" response transport failed `read tcp 127.0.0.1:33386->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:10.112+05:30 [Error] receiving packet: read tcp 127.0.0.1:47728->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.112+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-3640463043251321593) connection "127.0.0.1:47728" response transport failed `read tcp 127.0.0.1:47728->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.116+05:30 [Error] receiving packet: read tcp 127.0.0.1:47730->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.116+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-3640463043251321593) connection "127.0.0.1:47730" response transport failed `read tcp 127.0.0.1:47730->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.127+05:30 [Error] receiving packet: read tcp 127.0.0.1:47734->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.127+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-3018612949866443433) connection "127.0.0.1:47734" response transport failed `read tcp 127.0.0.1:47734->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.127+05:30 [Error] receiving packet: read tcp 127.0.0.1:33398->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:10.127+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-3018612949866443433) connection "127.0.0.1:33398" response transport failed `read tcp 127.0.0.1:33398->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:10.135+05:30 [Error] receiving packet: read tcp 127.0.0.1:33402->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:10.135+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-3807391880489158876) connection "127.0.0.1:33402" response transport failed `read tcp 127.0.0.1:33402->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:10.152+05:30 [Error] receiving packet: read tcp 127.0.0.1:47738->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.152+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(2150883948829977559) connection "127.0.0.1:47738" response transport failed `read tcp 127.0.0.1:47738->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.158+05:30 [Error] receiving packet: read tcp 127.0.0.1:33406->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:10.158+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(2900572353371536078) connection "127.0.0.1:33406" response transport failed `read tcp 127.0.0.1:33406->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:10.166+05:30 [Error] receiving packet: read tcp 127.0.0.1:33408->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:10.166+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(1966195008852535072) connection "127.0.0.1:33408" response transport failed `read tcp 127.0.0.1:33408->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:10.168+05:30 [Error] receiving packet: read tcp 127.0.0.1:33410->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:10.168+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(1966195008852535072) connection "127.0.0.1:33410" response transport failed `read tcp 127.0.0.1:33410->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:10.177+05:30 [Error] receiving packet: read tcp 127.0.0.1:47746->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.177+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(7367782385714190670) connection "127.0.0.1:47746" response transport failed `read tcp 127.0.0.1:47746->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.187+05:30 [Error] receiving packet: read tcp 127.0.0.1:47750->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.187+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(7992381510047560020) connection "127.0.0.1:47750" response transport failed `read tcp 127.0.0.1:47750->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.196+05:30 [Error] receiving packet: read tcp 127.0.0.1:47752->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.196+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-6643068154554310036) connection "127.0.0.1:47752" response transport failed `read tcp 127.0.0.1:47752->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.199+05:30 [Error] receiving packet: read tcp 127.0.0.1:47754->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.199+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-6643068154554310036) connection "127.0.0.1:47754" response transport failed `read tcp 127.0.0.1:47754->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.211+05:30 [Error] receiving packet: read tcp 127.0.0.1:33414->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:10.211+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-7715457582812500053) connection "127.0.0.1:33414" response transport failed `read tcp 127.0.0.1:33414->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:10.234+05:30 [Error] receiving packet: read tcp 127.0.0.1:47756->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.234+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(8264772012906668642) connection "127.0.0.1:47756" response transport failed `read tcp 127.0.0.1:47756->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.255+05:30 [Error] receiving packet: read tcp 127.0.0.1:47760->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.255+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(8264772012906668642) connection "127.0.0.1:47760" response transport failed `read tcp 127.0.0.1:47760->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.279+05:30 [Error] receiving packet: read tcp 127.0.0.1:33424->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:10.279+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(5601450658959586218) connection "127.0.0.1:33424" response transport failed `read tcp 127.0.0.1:33424->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:10.287+05:30 [Error] receiving packet: read tcp 127.0.0.1:33432->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:10.287+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(7094331550558343581) connection "127.0.0.1:33432" response transport failed `read tcp 127.0.0.1:33432->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:10.306+05:30 [Error] receiving packet: read tcp 127.0.0.1:33434->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:10.306+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(8510903442802029765) connection "127.0.0.1:33434" response transport failed `read tcp 127.0.0.1:33434->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:10.306+05:30 [Error] receiving packet: read tcp 127.0.0.1:47762->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.307+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(8510903442802029765) connection "127.0.0.1:47762" response transport failed `read tcp 127.0.0.1:47762->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.315+05:30 [Error] receiving packet: read tcp 127.0.0.1:33436->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:10.315+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-4826097346853285530) connection "127.0.0.1:33436" response transport failed `read tcp 127.0.0.1:33436->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:10.319+05:30 [Error] receiving packet: read tcp 127.0.0.1:47772->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.319+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-4826097346853285530) connection "127.0.0.1:47772" response transport failed `read tcp 127.0.0.1:47772->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.323+05:30 [Error] receiving packet: read tcp 127.0.0.1:47774->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.323+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-4826097346853285530) connection "127.0.0.1:47774" response transport failed `read tcp 127.0.0.1:47774->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.332+05:30 [Error] receiving packet: read tcp 127.0.0.1:47776->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.332+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-788456766003007120) connection "127.0.0.1:47776" response transport failed `read tcp 127.0.0.1:47776->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.334+05:30 [Error] receiving packet: read tcp 127.0.0.1:47778->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.334+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-788456766003007120) connection "127.0.0.1:47778" response transport failed `read tcp 127.0.0.1:47778->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.346+05:30 [Error] receiving packet: read tcp 127.0.0.1:47782->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.346+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-729680132127318525) connection "127.0.0.1:47782" response transport failed `read tcp 127.0.0.1:47782->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.355+05:30 [Error] receiving packet: read tcp 127.0.0.1:33446->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:10.355+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-6293539761290278186) connection "127.0.0.1:33446" response transport failed `read tcp 127.0.0.1:33446->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:10.362+05:30 [Error] receiving packet: read tcp 127.0.0.1:33452->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:10.362+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(4374168616725448621) connection "127.0.0.1:33452" response transport failed `read tcp 127.0.0.1:33452->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:10.369+05:30 [Error] receiving packet: read tcp 127.0.0.1:33454->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:10.369+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-7053943456817439369) connection "127.0.0.1:33454" response transport failed `read tcp 127.0.0.1:33454->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:10.378+05:30 [Error] receiving packet: read tcp 127.0.0.1:33456->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:10.378+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-574854530634293087) connection "127.0.0.1:33456" response transport failed `read tcp 127.0.0.1:33456->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:10.416+05:30 [Error] receiving packet: read tcp 127.0.0.1:47784->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.416+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(9088765655663528511) connection "127.0.0.1:47784" response transport failed `read tcp 127.0.0.1:47784->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.423+05:30 [Error] receiving packet: read tcp 127.0.0.1:33458->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:10.423+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-2682249246594870538) connection "127.0.0.1:33458" response transport failed `read tcp 127.0.0.1:33458->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:10.424+05:30 [Error] receiving packet: read tcp 127.0.0.1:47794->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.424+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-2682249246594870538) connection "127.0.0.1:47794" response transport failed `read tcp 127.0.0.1:47794->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.427+05:30 [Error] receiving packet: read tcp 127.0.0.1:47796->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.427+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-2682249246594870538) connection "127.0.0.1:47796" response transport failed `read tcp 127.0.0.1:47796->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.444+05:30 [Error] receiving packet: read tcp 127.0.0.1:33464->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:10.444+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-6831381317802048923) connection "127.0.0.1:33464" response transport failed `read tcp 127.0.0.1:33464->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:10.458+05:30 [Error] receiving packet: read tcp 127.0.0.1:47800->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.458+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(293315281355819257) connection "127.0.0.1:47800" response transport failed `read tcp 127.0.0.1:47800->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.484+05:30 [Error] receiving packet: read tcp 127.0.0.1:33468->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:10.484+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-7710968594359174619) connection "127.0.0.1:33468" response transport failed `read tcp 127.0.0.1:33468->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:10.487+05:30 [Error] receiving packet: read tcp 127.0.0.1:33478->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:10.487+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-7710968594359174619) connection "127.0.0.1:33478" response transport failed `read tcp 127.0.0.1:33478->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:10.490+05:30 [Error] receiving packet: read tcp 127.0.0.1:33480->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:10.490+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-7710968594359174619) connection "127.0.0.1:33480" response transport failed `read tcp 127.0.0.1:33480->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:10.491+05:30 [Error] receiving packet: read tcp 127.0.0.1:47806->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.491+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-7710968594359174619) connection "127.0.0.1:47806" response transport failed `read tcp 127.0.0.1:47806->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.503+05:30 [Error] receiving packet: read tcp 127.0.0.1:33484->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:10.503+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(4922648889871966133) connection "127.0.0.1:33484" response transport failed `read tcp 127.0.0.1:33484->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:10.505+05:30 [Error] receiving packet: read tcp 127.0.0.1:33486->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:10.505+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(4922648889871966133) connection "127.0.0.1:33486" response transport failed `read tcp 127.0.0.1:33486->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:10.507+05:30 [Error] receiving packet: read tcp 127.0.0.1:33488->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:10.507+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(4922648889871966133) connection "127.0.0.1:33488" response transport failed `read tcp 127.0.0.1:33488->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:10.520+05:30 [Error] receiving packet: read tcp 127.0.0.1:47816->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.520+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-1691149391440183652) connection "127.0.0.1:47816" response transport failed `read tcp 127.0.0.1:47816->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:10.523+05:30 [Error] receiving packet: read tcp 127.0.0.1:47826->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:10.523+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-1691149391440183652) connection "127.0.0.1:47826" response transport failed `read tcp 127.0.0.1:47826->127.0.0.1:9113: i/o timeout`
2023/02/16 21:00:12 Rebalance progress: 24
2023-02-16T21:00:14.113+05:30 [Error] receiving packet: read tcp 127.0.0.1:47828->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.113+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(8107155847524079464) connection "127.0.0.1:47828" response transport failed `read tcp 127.0.0.1:47828->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.123+05:30 [Error] receiving packet: read tcp 127.0.0.1:47932->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.123+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(253398280663782474) connection "127.0.0.1:47932" response transport failed `read tcp 127.0.0.1:47932->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.132+05:30 [Error] receiving packet: read tcp 127.0.0.1:33490->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.132+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(4894756127499028191) connection "127.0.0.1:33490" response transport failed `read tcp 127.0.0.1:33490->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.144+05:30 [Error] receiving packet: read tcp 127.0.0.1:47936->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.144+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-4811560655790927938) connection "127.0.0.1:47936" response transport failed `read tcp 127.0.0.1:47936->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.152+05:30 [Error] receiving packet: read tcp 127.0.0.1:47938->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.152+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-6853136352633872247) connection "127.0.0.1:47938" response transport failed `read tcp 127.0.0.1:47938->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.160+05:30 [Error] receiving packet: read tcp 127.0.0.1:47940->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.160+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(586173853553670778) connection "127.0.0.1:47940" response transport failed `read tcp 127.0.0.1:47940->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.163+05:30 [Error] receiving packet: read tcp 127.0.0.1:33600->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.163+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(586173853553670778) connection "127.0.0.1:33600" response transport failed `read tcp 127.0.0.1:33600->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.163+05:30 [Error] receiving packet: read tcp 127.0.0.1:47942->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.163+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(586173853553670778) connection "127.0.0.1:47942" response transport failed `read tcp 127.0.0.1:47942->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.171+05:30 [Error] receiving packet: read tcp 127.0.0.1:33610->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.172+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-4592246377043516425) connection "127.0.0.1:33610" response transport failed `read tcp 127.0.0.1:33610->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.176+05:30 [Error] receiving packet: read tcp 127.0.0.1:33612->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.176+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-4592246377043516425) connection "127.0.0.1:33612" response transport failed `read tcp 127.0.0.1:33612->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.192+05:30 [Error] receiving packet: read tcp 127.0.0.1:47950->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.192+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-495954591836898958) connection "127.0.0.1:47950" response transport failed `read tcp 127.0.0.1:47950->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.195+05:30 [Error] receiving packet: read tcp 127.0.0.1:47952->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.196+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-495954591836898958) connection "127.0.0.1:47952" response transport failed `read tcp 127.0.0.1:47952->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.205+05:30 [Error] receiving packet: read tcp 127.0.0.1:33614->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.205+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-4208272614136956053) connection "127.0.0.1:33614" response transport failed `read tcp 127.0.0.1:33614->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.205+05:30 [Error] receiving packet: read tcp 127.0.0.1:47954->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.205+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-4208272614136956053) connection "127.0.0.1:47954" response transport failed `read tcp 127.0.0.1:47954->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.209+05:30 [Error] receiving packet: read tcp 127.0.0.1:33622->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.209+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-4208272614136956053) connection "127.0.0.1:33622" response transport failed `read tcp 127.0.0.1:33622->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.218+05:30 [Error] receiving packet: read tcp 127.0.0.1:47958->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.219+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-6710313546738361666) connection "127.0.0.1:47958" response transport failed `read tcp 127.0.0.1:47958->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.228+05:30 [Error] receiving packet: read tcp 127.0.0.1:33626->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.228+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(1146814383687662338) connection "127.0.0.1:33626" response transport failed `read tcp 127.0.0.1:33626->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.232+05:30 [Error] receiving packet: read tcp 127.0.0.1:47962->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.232+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(1146814383687662338) connection "127.0.0.1:47962" response transport failed `read tcp 127.0.0.1:47962->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.254+05:30 [Error] receiving packet: read tcp 127.0.0.1:47966->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.254+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-8147255453055980503) connection "127.0.0.1:47966" response transport failed `read tcp 127.0.0.1:47966->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.266+05:30 [Error] receiving packet: read tcp 127.0.0.1:33630->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.266+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-6649370324086705676) connection "127.0.0.1:33630" response transport failed `read tcp 127.0.0.1:33630->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.275+05:30 [Error] receiving packet: read tcp 127.0.0.1:33636->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.275+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(2138798919410795504) connection "127.0.0.1:33636" response transport failed `read tcp 127.0.0.1:33636->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.284+05:30 [Error] receiving packet: read tcp 127.0.0.1:33638->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.284+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-7597513643718790112) connection "127.0.0.1:33638" response transport failed `read tcp 127.0.0.1:33638->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.289+05:30 [Error] receiving packet: read tcp 127.0.0.1:47968->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.289+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(3320496211403332040) connection "127.0.0.1:47968" response transport failed `read tcp 127.0.0.1:47968->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.313+05:30 [Error] receiving packet: read tcp 127.0.0.1:47976->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.313+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(3984442156002492785) connection "127.0.0.1:47976" response transport failed `read tcp 127.0.0.1:47976->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.313+05:30 [Error] receiving packet: read tcp 127.0.0.1:33640->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.313+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(3984442156002492785) connection "127.0.0.1:33640" response transport failed `read tcp 127.0.0.1:33640->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.336+05:30 [Error] receiving packet: read tcp 127.0.0.1:47980->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.336+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-4429446370485234535) connection "127.0.0.1:47980" response transport failed `read tcp 127.0.0.1:47980->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.336+05:30 [Error] receiving packet: read tcp 127.0.0.1:33644->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.336+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-4429446370485234535) connection "127.0.0.1:33644" response transport failed `read tcp 127.0.0.1:33644->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.339+05:30 [Error] receiving packet: read tcp 127.0.0.1:33648->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.339+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-4429446370485234535) connection "127.0.0.1:33648" response transport failed `read tcp 127.0.0.1:33648->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.351+05:30 [Error] receiving packet: read tcp 127.0.0.1:33652->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.351+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-216927660277464715) connection "127.0.0.1:33652" response transport failed `read tcp 127.0.0.1:33652->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.358+05:30 [Error] receiving packet: read tcp 127.0.0.1:47984->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.358+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-1130256442576520363) connection "127.0.0.1:47984" response transport failed `read tcp 127.0.0.1:47984->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.358+05:30 [Error] receiving packet: read tcp 127.0.0.1:33654->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.358+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-1130256442576520363) connection "127.0.0.1:33654" response transport failed `read tcp 127.0.0.1:33654->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.362+05:30 [Error] receiving packet: read tcp 127.0.0.1:33656->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.362+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-1130256442576520363) connection "127.0.0.1:33656" response transport failed `read tcp 127.0.0.1:33656->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.372+05:30 [Error] receiving packet: read tcp 127.0.0.1:33660->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.372+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-5837474464085397407) connection "127.0.0.1:33660" response transport failed `read tcp 127.0.0.1:33660->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.375+05:30 [Error] receiving packet: read tcp 127.0.0.1:33662->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.376+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-5837474464085397407) connection "127.0.0.1:33662" response transport failed `read tcp 127.0.0.1:33662->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.378+05:30 [Error] receiving packet: read tcp 127.0.0.1:33664->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.378+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-5837474464085397407) connection "127.0.0.1:33664" response transport failed `read tcp 127.0.0.1:33664->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.381+05:30 [Error] receiving packet: read tcp 127.0.0.1:33666->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.381+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-5837474464085397407) connection "127.0.0.1:33666" response transport failed `read tcp 127.0.0.1:33666->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.387+05:30 [Error] receiving packet: read tcp 127.0.0.1:47992->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.387+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-3905449269735239395) connection "127.0.0.1:47992" response transport failed `read tcp 127.0.0.1:47992->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.390+05:30 [Error] receiving packet: read tcp 127.0.0.1:48002->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.390+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-3905449269735239395) connection "127.0.0.1:48002" response transport failed `read tcp 127.0.0.1:48002->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.393+05:30 [Error] receiving packet: read tcp 127.0.0.1:33670->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.393+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-3905449269735239395) connection "127.0.0.1:33670" response transport failed `read tcp 127.0.0.1:33670->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.396+05:30 [Error] receiving packet: read tcp 127.0.0.1:33672->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.396+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-3905449269735239395) connection "127.0.0.1:33672" response transport failed `read tcp 127.0.0.1:33672->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.398+05:30 [Error] receiving packet: read tcp 127.0.0.1:33674->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.398+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-3905449269735239395) connection "127.0.0.1:33674" response transport failed `read tcp 127.0.0.1:33674->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.402+05:30 [Error] receiving packet: read tcp 127.0.0.1:48010->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.402+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-3905449269735239395) connection "127.0.0.1:48010" response transport failed `read tcp 127.0.0.1:48010->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.404+05:30 [Error] receiving packet: read tcp 127.0.0.1:48012->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.404+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-3905449269735239395) connection "127.0.0.1:48012" response transport failed `read tcp 127.0.0.1:48012->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.407+05:30 [Error] receiving packet: read tcp 127.0.0.1:48014->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.407+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-3905449269735239395) connection "127.0.0.1:48014" response transport failed `read tcp 127.0.0.1:48014->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.410+05:30 [Error] receiving packet: read tcp 127.0.0.1:33682->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.410+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-3905449269735239395) connection "127.0.0.1:33682" response transport failed `read tcp 127.0.0.1:33682->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.418+05:30 [Error] receiving packet: read tcp 127.0.0.1:48018->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.418+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(5580781226323800754) connection "127.0.0.1:48018" response transport failed `read tcp 127.0.0.1:48018->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.430+05:30 [Error] receiving packet: read tcp 127.0.0.1:33688->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.430+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-5095564371117343826) connection "127.0.0.1:33688" response transport failed `read tcp 127.0.0.1:33688->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.447+05:30 [Error] receiving packet: read tcp 127.0.0.1:33690->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.447+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-6036175691523081559) connection "127.0.0.1:33690" response transport failed `read tcp 127.0.0.1:33690->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.474+05:30 [Error] receiving packet: read tcp 127.0.0.1:48020->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.474+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-8346215225891361983) connection "127.0.0.1:48020" response transport failed `read tcp 127.0.0.1:48020->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.480+05:30 [Error] receiving packet: read tcp 127.0.0.1:33692->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.480+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(2548392359304139051) connection "127.0.0.1:33692" response transport failed `read tcp 127.0.0.1:33692->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.482+05:30 [Error] receiving packet: read tcp 127.0.0.1:33694->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.482+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(2548392359304139051) connection "127.0.0.1:33694" response transport failed `read tcp 127.0.0.1:33694->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.490+05:30 [Error] receiving packet: read tcp 127.0.0.1:33696->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.490+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(9208976522472040935) connection "127.0.0.1:33696" response transport failed `read tcp 127.0.0.1:33696->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.499+05:30 [Error] receiving packet: read tcp 127.0.0.1:48034->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.499+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(1683386221844920054) connection "127.0.0.1:48034" response transport failed `read tcp 127.0.0.1:48034->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.516+05:30 [Error] receiving packet: read tcp 127.0.0.1:48036->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.517+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(3774582769719579834) connection "127.0.0.1:48036" response transport failed `read tcp 127.0.0.1:48036->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.517+05:30 [Error] receiving packet: read tcp 127.0.0.1:33698->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.517+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(3774582769719579834) connection "127.0.0.1:33698" response transport failed `read tcp 127.0.0.1:33698->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.520+05:30 [Error] receiving packet: read tcp 127.0.0.1:48038->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.520+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(3774582769719579834) connection "127.0.0.1:48038" response transport failed `read tcp 127.0.0.1:48038->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.522+05:30 [Error] receiving packet: read tcp 127.0.0.1:48040->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.522+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(3774582769719579834) connection "127.0.0.1:48040" response transport failed `read tcp 127.0.0.1:48040->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.531+05:30 [Error] receiving packet: read tcp 127.0.0.1:33708->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.531+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(534031663406432903) connection "127.0.0.1:33708" response transport failed `read tcp 127.0.0.1:33708->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.539+05:30 [Error] receiving packet: read tcp 127.0.0.1:48044->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.539+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-8796930371603146758) connection "127.0.0.1:48044" response transport failed `read tcp 127.0.0.1:48044->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.547+05:30 [Error] receiving packet: read tcp 127.0.0.1:33712->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.547+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-5813922191504337519) connection "127.0.0.1:33712" response transport failed `read tcp 127.0.0.1:33712->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.563+05:30 [Error] receiving packet: read tcp 127.0.0.1:48048->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.563+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-2374502250044748248) connection "127.0.0.1:48048" response transport failed `read tcp 127.0.0.1:48048->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.566+05:30 [Error] receiving packet: read tcp 127.0.0.1:48052->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.566+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-2374502250044748248) connection "127.0.0.1:48052" response transport failed `read tcp 127.0.0.1:48052->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.566+05:30 [Error] receiving packet: read tcp 127.0.0.1:33716->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.567+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-2374502250044748248) connection "127.0.0.1:33716" response transport failed `read tcp 127.0.0.1:33716->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.575+05:30 [Error] receiving packet: read tcp 127.0.0.1:33720->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.575+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(5699780418585226269) connection "127.0.0.1:33720" response transport failed `read tcp 127.0.0.1:33720->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.600+05:30 [Error] receiving packet: read tcp 127.0.0.1:33722->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.600+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(5699780418585226269) connection "127.0.0.1:33722" response transport failed `read tcp 127.0.0.1:33722->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.604+05:30 [Error] receiving packet: read tcp 127.0.0.1:33724->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.604+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(5699780418585226269) connection "127.0.0.1:33724" response transport failed `read tcp 127.0.0.1:33724->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.607+05:30 [Error] receiving packet: read tcp 127.0.0.1:33726->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.607+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(5699780418585226269) connection "127.0.0.1:33726" response transport failed `read tcp 127.0.0.1:33726->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.645+05:30 [Error] receiving packet: read tcp 127.0.0.1:33730->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.645+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(4603195469729961927) connection "127.0.0.1:33730" response transport failed `read tcp 127.0.0.1:33730->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.659+05:30 [Error] receiving packet: read tcp 127.0.0.1:48062->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.659+05:30 [Error] receiving packet: read tcp 127.0.0.1:33732->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.659+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(1092630595973388593) connection "127.0.0.1:48062" response transport failed `read tcp 127.0.0.1:48062->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.659+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(1092630595973388593) connection "127.0.0.1:33732" response transport failed `read tcp 127.0.0.1:33732->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.670+05:30 [Error] receiving packet: read tcp 127.0.0.1:33738->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.670+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(6265350810815476945) connection "127.0.0.1:33738" response transport failed `read tcp 127.0.0.1:33738->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.677+05:30 [Error] receiving packet: read tcp 127.0.0.1:48068->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.677+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-6562852053299213378) connection "127.0.0.1:48068" response transport failed `read tcp 127.0.0.1:48068->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.703+05:30 [Error] receiving packet: read tcp 127.0.0.1:33740->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.703+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-3947865969121002940) connection "127.0.0.1:33740" response transport failed `read tcp 127.0.0.1:33740->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.715+05:30 [Error] receiving packet: read tcp 127.0.0.1:33744->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.715+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-1847383559681543392) connection "127.0.0.1:33744" response transport failed `read tcp 127.0.0.1:33744->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.730+05:30 [Error] receiving packet: read tcp 127.0.0.1:33746->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.730+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(4245341121151739849) connection "127.0.0.1:33746" response transport failed `read tcp 127.0.0.1:33746->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.730+05:30 [Error] receiving packet: read tcp 127.0.0.1:48076->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.730+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(4245341121151739849) connection "127.0.0.1:48076" response transport failed `read tcp 127.0.0.1:48076->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.754+05:30 [Error] receiving packet: read tcp 127.0.0.1:33752->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.754+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(4775444354207138807) connection "127.0.0.1:33752" response transport failed `read tcp 127.0.0.1:33752->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.754+05:30 [Error] receiving packet: read tcp 127.0.0.1:48082->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.755+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(4775444354207138807) connection "127.0.0.1:48082" response transport failed `read tcp 127.0.0.1:48082->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.764+05:30 [Error] receiving packet: read tcp 127.0.0.1:33754->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.764+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(4291504758759783118) connection "127.0.0.1:33754" response transport failed `read tcp 127.0.0.1:33754->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.792+05:30 [Error] receiving packet: read tcp 127.0.0.1:33758->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.792+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(5937970439166883904) connection "127.0.0.1:33758" response transport failed `read tcp 127.0.0.1:33758->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.807+05:30 [Error] receiving packet: read tcp 127.0.0.1:33760->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.807+05:30 [Error] receiving packet: read tcp 127.0.0.1:48090->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.807+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-4402453153837793401) connection "127.0.0.1:33760" response transport failed `read tcp 127.0.0.1:33760->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.807+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-4402453153837793401) connection "127.0.0.1:48090" response transport failed `read tcp 127.0.0.1:48090->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.819+05:30 [Error] receiving packet: read tcp 127.0.0.1:48098->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.819+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(8543761583638667437) connection "127.0.0.1:48098" response transport failed `read tcp 127.0.0.1:48098->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.822+05:30 [Error] receiving packet: read tcp 127.0.0.1:48100->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.822+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(8543761583638667437) connection "127.0.0.1:48100" response transport failed `read tcp 127.0.0.1:48100->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.842+05:30 [Error] receiving packet: read tcp 127.0.0.1:33762->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.842+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-1880704834430759274) connection "127.0.0.1:33762" response transport failed `read tcp 127.0.0.1:33762->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.857+05:30 [Error] receiving packet: read tcp 127.0.0.1:48102->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.857+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(3651368027141536939) connection "127.0.0.1:48102" response transport failed `read tcp 127.0.0.1:48102->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.864+05:30 [Error] receiving packet: read tcp 127.0.0.1:33770->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.864+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(3609615407793659265) connection "127.0.0.1:33770" response transport failed `read tcp 127.0.0.1:33770->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.872+05:30 [Error] receiving packet: read tcp 127.0.0.1:48106->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.872+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-2233379748642966509) connection "127.0.0.1:48106" response transport failed `read tcp 127.0.0.1:48106->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.882+05:30 [Error] receiving packet: read tcp 127.0.0.1:33776->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.882+05:30 [Error] receiving packet: read tcp 127.0.0.1:48108->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.882+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-2392489462278539905) connection "127.0.0.1:33776" response transport failed `read tcp 127.0.0.1:33776->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.882+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-2392489462278539905) connection "127.0.0.1:48108" response transport failed `read tcp 127.0.0.1:48108->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.903+05:30 [Error] receiving packet: read tcp 127.0.0.1:33778->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.903+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-6863567810654892964) connection "127.0.0.1:33778" response transport failed `read tcp 127.0.0.1:33778->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.911+05:30 [Error] receiving packet: read tcp 127.0.0.1:33780->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.911+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-5767009050091418255) connection "127.0.0.1:33780" response transport failed `read tcp 127.0.0.1:33780->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.913+05:30 [Error] receiving packet: read tcp 127.0.0.1:33782->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.913+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-5767009050091418255) connection "127.0.0.1:33782" response transport failed `read tcp 127.0.0.1:33782->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.925+05:30 [Error] receiving packet: read tcp 127.0.0.1:33784->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.925+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-1721778587499134181) connection "127.0.0.1:33784" response transport failed `read tcp 127.0.0.1:33784->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.925+05:30 [Error] receiving packet: read tcp 127.0.0.1:48120->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.925+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-1721778587499134181) connection "127.0.0.1:48120" response transport failed `read tcp 127.0.0.1:48120->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.928+05:30 [Error] receiving packet: read tcp 127.0.0.1:33788->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.928+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-1721778587499134181) connection "127.0.0.1:33788" response transport failed `read tcp 127.0.0.1:33788->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.930+05:30 [Error] receiving packet: read tcp 127.0.0.1:48124->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:14.930+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-1721778587499134181) connection "127.0.0.1:48124" response transport failed `read tcp 127.0.0.1:48124->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:14.949+05:30 [Error] receiving packet: read tcp 127.0.0.1:33792->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.949+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-411110702678042620) connection "127.0.0.1:33792" response transport failed `read tcp 127.0.0.1:33792->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.966+05:30 [Error] receiving packet: read tcp 127.0.0.1:33796->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.966+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(6898210975946908798) connection "127.0.0.1:33796" response transport failed `read tcp 127.0.0.1:33796->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:14.982+05:30 [Error] receiving packet: read tcp 127.0.0.1:33798->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:14.982+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-2200193155195170408) connection "127.0.0.1:33798" response transport failed `read tcp 127.0.0.1:33798->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:15.001+05:30 [Error] receiving packet: read tcp 127.0.0.1:33800->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:15.001+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-4575482630913023742) connection "127.0.0.1:33800" response transport failed `read tcp 127.0.0.1:33800->127.0.0.1:9107: i/o timeout`
2023/02/16 21:00:17 Rebalance progress: 24
2023-02-16T21:00:18.562+05:30 [Error] receiving packet: read tcp 127.0.0.1:33814->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.562+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-3749948484570804962) connection "127.0.0.1:33814" response transport failed `read tcp 127.0.0.1:33814->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.571+05:30 [Error] receiving packet: read tcp 127.0.0.1:33896->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.571+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-1355842494270310596) connection "127.0.0.1:33896" response transport failed `read tcp 127.0.0.1:33896->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.574+05:30 [Error] receiving packet: read tcp 127.0.0.1:33898->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.574+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-1355842494270310596) connection "127.0.0.1:33898" response transport failed `read tcp 127.0.0.1:33898->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.574+05:30 [Error] receiving packet: read tcp 127.0.0.1:48128->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.574+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-1355842494270310596) connection "127.0.0.1:48128" response transport failed `read tcp 127.0.0.1:48128->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.578+05:30 [Error] receiving packet: read tcp 127.0.0.1:48234->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.578+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-1355842494270310596) connection "127.0.0.1:48234" response transport failed `read tcp 127.0.0.1:48234->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.595+05:30 [Error] receiving packet: read tcp 127.0.0.1:33902->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.595+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-1490440564245986968) connection "127.0.0.1:33902" response transport failed `read tcp 127.0.0.1:33902->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.599+05:30 [Error] receiving packet: read tcp 127.0.0.1:33904->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.599+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-1490440564245986968) connection "127.0.0.1:33904" response transport failed `read tcp 127.0.0.1:33904->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.601+05:30 [Error] receiving packet: read tcp 127.0.0.1:33906->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.601+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-1490440564245986968) connection "127.0.0.1:33906" response transport failed `read tcp 127.0.0.1:33906->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.610+05:30 [Error] receiving packet: read tcp 127.0.0.1:48242->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.610+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-1499770686562298501) connection "127.0.0.1:48242" response transport failed `read tcp 127.0.0.1:48242->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.620+05:30 [Error] receiving packet: read tcp 127.0.0.1:33912->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.620+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(4523608865728946306) connection "127.0.0.1:33912" response transport failed `read tcp 127.0.0.1:33912->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.629+05:30 [Error] receiving packet: read tcp 127.0.0.1:48244->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.629+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-7547633076399306861) connection "127.0.0.1:48244" response transport failed `read tcp 127.0.0.1:48244->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.629+05:30 [Error] receiving packet: read tcp 127.0.0.1:33914->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.629+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-7547633076399306861) connection "127.0.0.1:33914" response transport failed `read tcp 127.0.0.1:33914->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.637+05:30 [Error] receiving packet: read tcp 127.0.0.1:33916->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.637+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(7972640637863281826) connection "127.0.0.1:33916" response transport failed `read tcp 127.0.0.1:33916->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.640+05:30 [Error] receiving packet: read tcp 127.0.0.1:48252->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.640+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(7972640637863281826) connection "127.0.0.1:48252" response transport failed `read tcp 127.0.0.1:48252->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.643+05:30 [Error] receiving packet: read tcp 127.0.0.1:48254->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.643+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(7972640637863281826) connection "127.0.0.1:48254" response transport failed `read tcp 127.0.0.1:48254->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.652+05:30 [Error] receiving packet: read tcp 127.0.0.1:33922->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.652+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(9027558390012344976) connection "127.0.0.1:33922" response transport failed `read tcp 127.0.0.1:33922->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.657+05:30 [Error] receiving packet: read tcp 127.0.0.1:48258->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.657+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(9027558390012344976) connection "127.0.0.1:48258" response transport failed `read tcp 127.0.0.1:48258->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.661+05:30 [Error] receiving packet: read tcp 127.0.0.1:33928->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.661+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(9027558390012344976) connection "127.0.0.1:33928" response transport failed `read tcp 127.0.0.1:33928->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.694+05:30 [Error] receiving packet: read tcp 127.0.0.1:33936->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.695+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-7629023030216325065) connection "127.0.0.1:33936" response transport failed `read tcp 127.0.0.1:33936->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.701+05:30 [Error] receiving packet: read tcp 127.0.0.1:48266->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.701+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(5454389261948859205) connection "127.0.0.1:48266" response transport failed `read tcp 127.0.0.1:48266->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.703+05:30 [Error] receiving packet: read tcp 127.0.0.1:48276->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.703+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(5454389261948859205) connection "127.0.0.1:48276" response transport failed `read tcp 127.0.0.1:48276->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.715+05:30 [Error] receiving packet: read tcp 127.0.0.1:33946->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.715+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(6179656253187919946) connection "127.0.0.1:33946" response transport failed `read tcp 127.0.0.1:33946->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.718+05:30 [Error] receiving packet: read tcp 127.0.0.1:33948->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.718+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(6179656253187919946) connection "127.0.0.1:33948" response transport failed `read tcp 127.0.0.1:33948->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.718+05:30 [Error] receiving packet: read tcp 127.0.0.1:48278->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.718+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(6179656253187919946) connection "127.0.0.1:48278" response transport failed `read tcp 127.0.0.1:48278->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.721+05:30 [Error] receiving packet: read tcp 127.0.0.1:33950->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.721+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(6179656253187919946) connection "127.0.0.1:33950" response transport failed `read tcp 127.0.0.1:33950->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.742+05:30 [Error] receiving packet: read tcp 127.0.0.1:33952->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.743+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-1853208573947945360) connection "127.0.0.1:33952" response transport failed `read tcp 127.0.0.1:33952->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.753+05:30 [Error] receiving packet: read tcp 127.0.0.1:48288->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.753+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-4557207365738301424) connection "127.0.0.1:48288" response transport failed `read tcp 127.0.0.1:48288->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.756+05:30 [Error] receiving packet: read tcp 127.0.0.1:33956->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.756+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-4557207365738301424) connection "127.0.0.1:33956" response transport failed `read tcp 127.0.0.1:33956->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.762+05:30 [Error] receiving packet: read tcp 127.0.0.1:48292->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.762+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-4557207365738301424) connection "127.0.0.1:48292" response transport failed `read tcp 127.0.0.1:48292->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.765+05:30 [Error] receiving packet: read tcp 127.0.0.1:48298->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.766+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-4557207365738301424) connection "127.0.0.1:48298" response transport failed `read tcp 127.0.0.1:48298->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.769+05:30 [Error] receiving packet: read tcp 127.0.0.1:33966->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.769+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-4557207365738301424) connection "127.0.0.1:33966" response transport failed `read tcp 127.0.0.1:33966->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.782+05:30 [Error] receiving packet: read tcp 127.0.0.1:33972->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.782+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-1678971775045052567) connection "127.0.0.1:33972" response transport failed `read tcp 127.0.0.1:33972->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.790+05:30 [Error] receiving packet: read tcp 127.0.0.1:33974->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.790+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-7464717001414700285) connection "127.0.0.1:33974" response transport failed `read tcp 127.0.0.1:33974->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.814+05:30 [Error] receiving packet: read tcp 127.0.0.1:48304->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.815+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(6922399723842077700) connection "127.0.0.1:48304" response transport failed `read tcp 127.0.0.1:48304->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.822+05:30 [Error] receiving packet: read tcp 127.0.0.1:48312->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.822+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(6922399723842077700) connection "127.0.0.1:48312" response transport failed `read tcp 127.0.0.1:48312->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.834+05:30 [Error] receiving packet: read tcp 127.0.0.1:48316->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.834+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(6463546328336782316) connection "127.0.0.1:48316" response transport failed `read tcp 127.0.0.1:48316->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.857+05:30 [Error] receiving packet: read tcp 127.0.0.1:33976->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.857+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(3620539644528311080) connection "127.0.0.1:33976" response transport failed `read tcp 127.0.0.1:33976->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.865+05:30 [Error] receiving packet: read tcp 127.0.0.1:33986->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.865+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-4439024914231150311) connection "127.0.0.1:33986" response transport failed `read tcp 127.0.0.1:33986->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.871+05:30 [Error] receiving packet: read tcp 127.0.0.1:48318->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.871+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(4384351031296402113) connection "127.0.0.1:48318" response transport failed `read tcp 127.0.0.1:48318->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.879+05:30 [Error] receiving packet: read tcp 127.0.0.1:33988->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.879+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(1496417009002836330) connection "127.0.0.1:33988" response transport failed `read tcp 127.0.0.1:33988->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.882+05:30 [Error] receiving packet: read tcp 127.0.0.1:33990->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.882+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(1496417009002836330) connection "127.0.0.1:33990" response transport failed `read tcp 127.0.0.1:33990->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.885+05:30 [Error] receiving packet: read tcp 127.0.0.1:33992->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.885+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(1496417009002836330) connection "127.0.0.1:33992" response transport failed `read tcp 127.0.0.1:33992->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.888+05:30 [Error] receiving packet: read tcp 127.0.0.1:48328->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.888+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(1496417009002836330) connection "127.0.0.1:48328" response transport failed `read tcp 127.0.0.1:48328->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.891+05:30 [Error] receiving packet: read tcp 127.0.0.1:48330->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.891+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(1496417009002836330) connection "127.0.0.1:48330" response transport failed `read tcp 127.0.0.1:48330->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.899+05:30 [Error] receiving packet: read tcp 127.0.0.1:48332->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.899+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(7260369748914143690) connection "127.0.0.1:48332" response transport failed `read tcp 127.0.0.1:48332->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.902+05:30 [Error] receiving packet: read tcp 127.0.0.1:48334->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.902+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(7260369748914143690) connection "127.0.0.1:48334" response transport failed `read tcp 127.0.0.1:48334->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.933+05:30 [Error] receiving packet: read tcp 127.0.0.1:34004->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.933+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-7716399669302937293) connection "127.0.0.1:34004" response transport failed `read tcp 127.0.0.1:34004->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.933+05:30 [Error] receiving packet: read tcp 127.0.0.1:48336->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.933+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-7716399669302937293) connection "127.0.0.1:48336" response transport failed `read tcp 127.0.0.1:48336->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.937+05:30 [Error] receiving packet: read tcp 127.0.0.1:48340->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.937+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-7716399669302937293) connection "127.0.0.1:48340" response transport failed `read tcp 127.0.0.1:48340->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.940+05:30 [Error] receiving packet: read tcp 127.0.0.1:48342->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.940+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-7716399669302937293) connection "127.0.0.1:48342" response transport failed `read tcp 127.0.0.1:48342->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.947+05:30 [Error] receiving packet: read tcp 127.0.0.1:34010->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.947+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(3438871242435532262) connection "127.0.0.1:34010" response transport failed `read tcp 127.0.0.1:34010->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.950+05:30 [Error] receiving packet: read tcp 127.0.0.1:34012->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:18.950+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(3438871242435532262) connection "127.0.0.1:34012" response transport failed `read tcp 127.0.0.1:34012->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:18.958+05:30 [Error] receiving packet: read tcp 127.0.0.1:48348->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.959+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-5650112912684350830) connection "127.0.0.1:48348" response transport failed `read tcp 127.0.0.1:48348->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.961+05:30 [Error] receiving packet: read tcp 127.0.0.1:48350->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.961+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-5650112912684350830) connection "127.0.0.1:48350" response transport failed `read tcp 127.0.0.1:48350->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.964+05:30 [Error] receiving packet: read tcp 127.0.0.1:48352->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.964+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-5650112912684350830) connection "127.0.0.1:48352" response transport failed `read tcp 127.0.0.1:48352->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.968+05:30 [Error] receiving packet: read tcp 127.0.0.1:48354->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.968+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-5650112912684350830) connection "127.0.0.1:48354" response transport failed `read tcp 127.0.0.1:48354->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.970+05:30 [Error] receiving packet: read tcp 127.0.0.1:48356->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.970+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-5650112912684350830) connection "127.0.0.1:48356" response transport failed `read tcp 127.0.0.1:48356->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:18.999+05:30 [Error] receiving packet: read tcp 127.0.0.1:48360->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:18.999+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(3428681141740896533) connection "127.0.0.1:48360" response transport failed `read tcp 127.0.0.1:48360->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.006+05:30 [Error] receiving packet: read tcp 127.0.0.1:34024->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:19.006+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-6716595551575233309) connection "127.0.0.1:34024" response transport failed `read tcp 127.0.0.1:34024->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:19.022+05:30 [Error] receiving packet: read tcp 127.0.0.1:48364->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.022+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-1101306685394295392) connection "127.0.0.1:48364" response transport failed `read tcp 127.0.0.1:48364->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.031+05:30 [Error] receiving packet: read tcp 127.0.0.1:48366->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.031+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(6012011079079858735) connection "127.0.0.1:48366" response transport failed `read tcp 127.0.0.1:48366->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.035+05:30 [Error] receiving packet: read tcp 127.0.0.1:48368->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.035+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(6012011079079858735) connection "127.0.0.1:48368" response transport failed `read tcp 127.0.0.1:48368->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.037+05:30 [Error] receiving packet: read tcp 127.0.0.1:48370->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.037+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(6012011079079858735) connection "127.0.0.1:48370" response transport failed `read tcp 127.0.0.1:48370->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.040+05:30 [Error] receiving packet: read tcp 127.0.0.1:48372->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.040+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(6012011079079858735) connection "127.0.0.1:48372" response transport failed `read tcp 127.0.0.1:48372->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.043+05:30 [Error] receiving packet: read tcp 127.0.0.1:48374->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.043+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(6012011079079858735) connection "127.0.0.1:48374" response transport failed `read tcp 127.0.0.1:48374->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.055+05:30 [Error] receiving packet: read tcp 127.0.0.1:34028->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:19.055+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-5111461553079936413) connection "127.0.0.1:34028" response transport failed `read tcp 127.0.0.1:34028->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:19.058+05:30 [Error] receiving packet: read tcp 127.0.0.1:48376->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.058+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-5111461553079936413) connection "127.0.0.1:48376" response transport failed `read tcp 127.0.0.1:48376->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.061+05:30 [Error] receiving packet: read tcp 127.0.0.1:48378->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.061+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-5111461553079936413) connection "127.0.0.1:48378" response transport failed `read tcp 127.0.0.1:48378->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.073+05:30 [Error] receiving packet: read tcp 127.0.0.1:48380->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.073+05:30 [Error] receiving packet: read tcp 127.0.0.1:34048->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:19.073+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(3160352388575475190) connection "127.0.0.1:48380" response transport failed `read tcp 127.0.0.1:48380->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.073+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(3160352388575475190) connection "127.0.0.1:34048" response transport failed `read tcp 127.0.0.1:34048->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:19.084+05:30 [Error] receiving packet: read tcp 127.0.0.1:34052->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:19.084+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-1355788709420951248) connection "127.0.0.1:34052" response transport failed `read tcp 127.0.0.1:34052->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:19.088+05:30 [Error] receiving packet: read tcp 127.0.0.1:34054->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:19.088+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-1355788709420951248) connection "127.0.0.1:34054" response transport failed `read tcp 127.0.0.1:34054->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:19.121+05:30 [Error] receiving packet: read tcp 127.0.0.1:48384->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.121+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(2484222076521338390) connection "127.0.0.1:48384" response transport failed `read tcp 127.0.0.1:48384->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.130+05:30 [Error] receiving packet: read tcp 127.0.0.1:34056->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:19.130+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-400906106426105020) connection "127.0.0.1:34056" response transport failed `read tcp 127.0.0.1:34056->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:19.130+05:30 [Error] receiving packet: read tcp 127.0.0.1:48392->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.130+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-400906106426105020) connection "127.0.0.1:48392" response transport failed `read tcp 127.0.0.1:48392->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.142+05:30 [Error] receiving packet: read tcp 127.0.0.1:34062->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:19.142+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(7625030703606481935) connection "127.0.0.1:34062" response transport failed `read tcp 127.0.0.1:34062->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:19.150+05:30 [Error] receiving packet: read tcp 127.0.0.1:34064->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:19.150+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-3157336762310452858) connection "127.0.0.1:34064" response transport failed `read tcp 127.0.0.1:34064->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:19.179+05:30 [Error] receiving packet: read tcp 127.0.0.1:48394->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.179+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-7490870428362714589) connection "127.0.0.1:48394" response transport failed `read tcp 127.0.0.1:48394->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.215+05:30 [Error] receiving packet: read tcp 127.0.0.1:48402->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.215+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(1650079444221165017) connection "127.0.0.1:48402" response transport failed `read tcp 127.0.0.1:48402->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.240+05:30 [Error] receiving packet: read tcp 127.0.0.1:48404->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.240+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(8827926226613762766) connection "127.0.0.1:48404" response transport failed `read tcp 127.0.0.1:48404->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.242+05:30 [Error] receiving packet: read tcp 127.0.0.1:48406->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.242+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(8827926226613762766) connection "127.0.0.1:48406" response transport failed `read tcp 127.0.0.1:48406->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.244+05:30 [Error] receiving packet: read tcp 127.0.0.1:48408->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.244+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(8827926226613762766) connection "127.0.0.1:48408" response transport failed `read tcp 127.0.0.1:48408->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.248+05:30 [Error] receiving packet: read tcp 127.0.0.1:48410->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.248+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(8827926226613762766) connection "127.0.0.1:48410" response transport failed `read tcp 127.0.0.1:48410->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.250+05:30 [Error] receiving packet: read tcp 127.0.0.1:48412->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.250+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(8827926226613762766) connection "127.0.0.1:48412" response transport failed `read tcp 127.0.0.1:48412->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.258+05:30 [Error] receiving packet: read tcp 127.0.0.1:34066->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:19.258+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-7614092341234926367) connection "127.0.0.1:34066" response transport failed `read tcp 127.0.0.1:34066->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:19.270+05:30 [Error] receiving packet: read tcp 127.0.0.1:48414->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.270+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(5851467485994797645) connection "127.0.0.1:48414" response transport failed `read tcp 127.0.0.1:48414->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.278+05:30 [Error] receiving packet: read tcp 127.0.0.1:34084->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:19.278+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(2534783400027442402) connection "127.0.0.1:34084" response transport failed `read tcp 127.0.0.1:34084->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:19.289+05:30 [Error] receiving packet: read tcp 127.0.0.1:48420->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.289+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(4220626268410503459) connection "127.0.0.1:48420" response transport failed `read tcp 127.0.0.1:48420->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.292+05:30 [Error] receiving packet: read tcp 127.0.0.1:34088->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:19.292+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(4220626268410503459) connection "127.0.0.1:34088" response transport failed `read tcp 127.0.0.1:34088->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:19.295+05:30 [Error] receiving packet: read tcp 127.0.0.1:34090->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:19.295+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(4220626268410503459) connection "127.0.0.1:34090" response transport failed `read tcp 127.0.0.1:34090->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:19.323+05:30 [Error] receiving packet: read tcp 127.0.0.1:34094->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:19.323+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-2438551087390561960) connection "127.0.0.1:34094" response transport failed `read tcp 127.0.0.1:34094->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:19.323+05:30 [Error] receiving packet: read tcp 127.0.0.1:48426->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.323+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-2438551087390561960) connection "127.0.0.1:48426" response transport failed `read tcp 127.0.0.1:48426->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.334+05:30 [Error] receiving packet: read tcp 127.0.0.1:34098->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:19.334+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(2621910067947260863) connection "127.0.0.1:34098" response transport failed `read tcp 127.0.0.1:34098->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:19.342+05:30 [Error] receiving packet: read tcp 127.0.0.1:34100->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:19.342+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-8905660800807372691) connection "127.0.0.1:34100" response transport failed `read tcp 127.0.0.1:34100->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:19.346+05:30 [Error] receiving packet: read tcp 127.0.0.1:34102->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:19.346+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-8905660800807372691) connection "127.0.0.1:34102" response transport failed `read tcp 127.0.0.1:34102->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:19.354+05:30 [Error] receiving packet: read tcp 127.0.0.1:34104->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:19.354+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-2919117507646694470) connection "127.0.0.1:34104" response transport failed `read tcp 127.0.0.1:34104->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:19.356+05:30 [Error] receiving packet: read tcp 127.0.0.1:34106->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:19.356+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-2919117507646694470) connection "127.0.0.1:34106" response transport failed `read tcp 127.0.0.1:34106->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:19.430+05:30 [Error] receiving packet: read tcp 127.0.0.1:48430->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.430+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(2412515846049954628) connection "127.0.0.1:48430" response transport failed `read tcp 127.0.0.1:48430->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.435+05:30 [Error] receiving packet: read tcp 127.0.0.1:48444->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.435+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(2412515846049954628) connection "127.0.0.1:48444" response transport failed `read tcp 127.0.0.1:48444->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.440+05:30 [Error] receiving packet: read tcp 127.0.0.1:48446->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.440+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(2412515846049954628) connection "127.0.0.1:48446" response transport failed `read tcp 127.0.0.1:48446->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.440+05:30 [Error] receiving packet: read tcp 127.0.0.1:34108->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:19.440+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(2412515846049954628) connection "127.0.0.1:34108" response transport failed `read tcp 127.0.0.1:34108->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:19.443+05:30 [Error] receiving packet: read tcp 127.0.0.1:48448->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.443+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(2412515846049954628) connection "127.0.0.1:48448" response transport failed `read tcp 127.0.0.1:48448->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.447+05:30 [Error] receiving packet: read tcp 127.0.0.1:48450->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.447+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(2412515846049954628) connection "127.0.0.1:48450" response transport failed `read tcp 127.0.0.1:48450->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.451+05:30 [Error] receiving packet: read tcp 127.0.0.1:34118->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:19.451+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(2412515846049954628) connection "127.0.0.1:34118" response transport failed `read tcp 127.0.0.1:34118->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:19.461+05:30 [Error] receiving packet: read tcp 127.0.0.1:48456->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.461+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(1101768995693291083) connection "127.0.0.1:48456" response transport failed `read tcp 127.0.0.1:48456->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.464+05:30 [Error] receiving packet: read tcp 127.0.0.1:48458->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.464+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(1101768995693291083) connection "127.0.0.1:48458" response transport failed `read tcp 127.0.0.1:48458->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.466+05:30 [Error] receiving packet: read tcp 127.0.0.1:48460->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.467+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(1101768995693291083) connection "127.0.0.1:48460" response transport failed `read tcp 127.0.0.1:48460->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.467+05:30 [Error] receiving packet: read tcp 127.0.0.1:34120->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:19.467+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(1101768995693291083) connection "127.0.0.1:34120" response transport failed `read tcp 127.0.0.1:34120->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:19.470+05:30 [Error] receiving packet: read tcp 127.0.0.1:48462->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.470+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(1101768995693291083) connection "127.0.0.1:48462" response transport failed `read tcp 127.0.0.1:48462->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.475+05:30 [Error] receiving packet: read tcp 127.0.0.1:34130->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:19.476+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(1101768995693291083) connection "127.0.0.1:34130" response transport failed `read tcp 127.0.0.1:34130->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:19.487+05:30 [Error] receiving packet: read tcp 127.0.0.1:48468->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.487+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-4169618255224983253) connection "127.0.0.1:48468" response transport failed `read tcp 127.0.0.1:48468->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.489+05:30 [Error] receiving packet: read tcp 127.0.0.1:48470->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.489+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-4169618255224983253) connection "127.0.0.1:48470" response transport failed `read tcp 127.0.0.1:48470->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.492+05:30 [Error] receiving packet: read tcp 127.0.0.1:48472->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.492+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-4169618255224983253) connection "127.0.0.1:48472" response transport failed `read tcp 127.0.0.1:48472->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.495+05:30 [Error] receiving packet: read tcp 127.0.0.1:48474->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.495+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-4169618255224983253) connection "127.0.0.1:48474" response transport failed `read tcp 127.0.0.1:48474->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.498+05:30 [Error] receiving packet: read tcp 127.0.0.1:48476->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:19.498+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-4169618255224983253) connection "127.0.0.1:48476" response transport failed `read tcp 127.0.0.1:48476->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:19.522+05:30 [Error] receiving packet: read tcp 127.0.0.1:34132->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:19.522+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-2131092103216903908) connection "127.0.0.1:34132" response transport failed `read tcp 127.0.0.1:34132->127.0.0.1:9107: i/o timeout`
2023/02/16 21:00:22 Rebalance progress: 24
2023-02-16T21:00:22.893+05:30 [Error] receiving packet: read tcp 127.0.0.1:34146->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:22.893+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-9093575842646876962) connection "127.0.0.1:34146" response transport failed `read tcp 127.0.0.1:34146->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:22.893+05:30 [Error] receiving packet: read tcp 127.0.0.1:48478->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:22.893+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-9093575842646876962) connection "127.0.0.1:48478" response transport failed `read tcp 127.0.0.1:48478->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:22.901+05:30 [Error] receiving packet: read tcp 127.0.0.1:34228->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:22.901+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-9093575842646876962) connection "127.0.0.1:34228" response transport failed `read tcp 127.0.0.1:34228->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:22.903+05:30 [Error] receiving packet: read tcp 127.0.0.1:48564->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:22.903+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-9093575842646876962) connection "127.0.0.1:48564" response transport failed `read tcp 127.0.0.1:48564->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:22.906+05:30 [Error] receiving packet: read tcp 127.0.0.1:34232->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:22.906+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-9093575842646876962) connection "127.0.0.1:34232" response transport failed `read tcp 127.0.0.1:34232->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:22.909+05:30 [Error] receiving packet: read tcp 127.0.0.1:48568->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:22.909+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-9093575842646876962) connection "127.0.0.1:48568" response transport failed `read tcp 127.0.0.1:48568->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:22.912+05:30 [Error] receiving packet: read tcp 127.0.0.1:48570->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:22.912+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-9093575842646876962) connection "127.0.0.1:48570" response transport failed `read tcp 127.0.0.1:48570->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:22.924+05:30 [Error] receiving packet: read tcp 127.0.0.1:48574->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:22.924+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-4294856652054452085) connection "127.0.0.1:48574" response transport failed `read tcp 127.0.0.1:48574->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:22.934+05:30 [Error] receiving packet: read tcp 127.0.0.1:48576->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:22.934+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-1522257306547480545) connection "127.0.0.1:48576" response transport failed `read tcp 127.0.0.1:48576->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:22.965+05:30 [Error] receiving packet: read tcp 127.0.0.1:34238->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:22.965+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(5625988605319563575) connection "127.0.0.1:34238" response transport failed `read tcp 127.0.0.1:34238->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:22.968+05:30 [Error] receiving packet: read tcp 127.0.0.1:34246->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:22.968+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(5625988605319563575) connection "127.0.0.1:34246" response transport failed `read tcp 127.0.0.1:34246->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:22.971+05:30 [Error] receiving packet: read tcp 127.0.0.1:34248->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:22.971+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(5625988605319563575) connection "127.0.0.1:34248" response transport failed `read tcp 127.0.0.1:34248->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:22.978+05:30 [Error] receiving packet: read tcp 127.0.0.1:48578->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:22.978+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(4598274359787695045) connection "127.0.0.1:48578" response transport failed `read tcp 127.0.0.1:48578->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:22.996+05:30 [Error] receiving packet: read tcp 127.0.0.1:48586->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:22.996+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(4660725867036215862) connection "127.0.0.1:48586" response transport failed `read tcp 127.0.0.1:48586->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.002+05:30 [Error] receiving packet: read tcp 127.0.0.1:34250->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.002+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(4471049539473984474) connection "127.0.0.1:34250" response transport failed `read tcp 127.0.0.1:34250->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.028+05:30 [Error] receiving packet: read tcp 127.0.0.1:34254->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.028+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-3294305331399772839) connection "127.0.0.1:34254" response transport failed `read tcp 127.0.0.1:34254->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.029+05:30 [Error] receiving packet: read tcp 127.0.0.1:48592->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.029+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-3294305331399772839) connection "127.0.0.1:48592" response transport failed `read tcp 127.0.0.1:48592->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.038+05:30 [Error] receiving packet: read tcp 127.0.0.1:34260->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.038+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-3294305331399772839) connection "127.0.0.1:34260" response transport failed `read tcp 127.0.0.1:34260->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.049+05:30 [Error] receiving packet: read tcp 127.0.0.1:48598->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.049+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(4933777428925745911) connection "127.0.0.1:48598" response transport failed `read tcp 127.0.0.1:48598->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.059+05:30 [Error] receiving packet: read tcp 127.0.0.1:48600->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.059+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(6064053898603193871) connection "127.0.0.1:48600" response transport failed `read tcp 127.0.0.1:48600->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.059+05:30 [Error] receiving packet: read tcp 127.0.0.1:34262->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.059+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(6064053898603193871) connection "127.0.0.1:34262" response transport failed `read tcp 127.0.0.1:34262->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.076+05:30 [Error] receiving packet: read tcp 127.0.0.1:34272->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.077+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-2394917600766104826) connection "127.0.0.1:34272" response transport failed `read tcp 127.0.0.1:34272->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.080+05:30 [Error] receiving packet: read tcp 127.0.0.1:34274->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.080+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-2394917600766104826) connection "127.0.0.1:34274" response transport failed `read tcp 127.0.0.1:34274->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.083+05:30 [Error] receiving packet: read tcp 127.0.0.1:34276->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.083+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-2394917600766104826) connection "127.0.0.1:34276" response transport failed `read tcp 127.0.0.1:34276->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.092+05:30 [Error] receiving packet: read tcp 127.0.0.1:48604->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.092+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-5014175148094199510) connection "127.0.0.1:48604" response transport failed `read tcp 127.0.0.1:48604->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.095+05:30 [Error] receiving packet: read tcp 127.0.0.1:48614->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.095+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-5014175148094199510) connection "127.0.0.1:48614" response transport failed `read tcp 127.0.0.1:48614->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.102+05:30 [Error] receiving packet: read tcp 127.0.0.1:34278->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.102+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(6756217320314885644) connection "127.0.0.1:34278" response transport failed `read tcp 127.0.0.1:34278->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.104+05:30 [Error] receiving packet: read tcp 127.0.0.1:48616->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.104+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(6756217320314885644) connection "127.0.0.1:48616" response transport failed `read tcp 127.0.0.1:48616->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.107+05:30 [Error] receiving packet: read tcp 127.0.0.1:48618->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.107+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(6756217320314885644) connection "127.0.0.1:48618" response transport failed `read tcp 127.0.0.1:48618->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.110+05:30 [Error] receiving packet: read tcp 127.0.0.1:34286->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.110+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(6756217320314885644) connection "127.0.0.1:34286" response transport failed `read tcp 127.0.0.1:34286->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.136+05:30 [Error] receiving packet: read tcp 127.0.0.1:34288->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.136+05:30 [Error] receiving packet: read tcp 127.0.0.1:48624->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.136+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-8677254600485427409) connection "127.0.0.1:34288" response transport failed `read tcp 127.0.0.1:34288->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.136+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-8677254600485427409) connection "127.0.0.1:48624" response transport failed `read tcp 127.0.0.1:48624->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.144+05:30 [Error] receiving packet: read tcp 127.0.0.1:34292->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.144+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-289493664203849847) connection "127.0.0.1:34292" response transport failed `read tcp 127.0.0.1:34292->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.157+05:30 [Error] receiving packet: read tcp 127.0.0.1:48628->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.158+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-4087204022479749112) connection "127.0.0.1:48628" response transport failed `read tcp 127.0.0.1:48628->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.168+05:30 [Error] receiving packet: read tcp 127.0.0.1:34298->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.168+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-96540044989569302) connection "127.0.0.1:34298" response transport failed `read tcp 127.0.0.1:34298->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.172+05:30 [Error] receiving packet: read tcp 127.0.0.1:48630->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.172+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-2019730353072831713) connection "127.0.0.1:48630" response transport failed `read tcp 127.0.0.1:48630->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.176+05:30 [Error] receiving packet: read tcp 127.0.0.1:48634->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.176+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-2019730353072831713) connection "127.0.0.1:48634" response transport failed `read tcp 127.0.0.1:48634->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.192+05:30 [Error] receiving packet: read tcp 127.0.0.1:34302->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.192+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(4460347126497981226) connection "127.0.0.1:34302" response transport failed `read tcp 127.0.0.1:34302->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.198+05:30 [Error] receiving packet: read tcp 127.0.0.1:48638->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.198+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(1816408398199894282) connection "127.0.0.1:48638" response transport failed `read tcp 127.0.0.1:48638->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.212+05:30 [Error] receiving packet: read tcp 127.0.0.1:48642->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.212+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(6404018220144136946) connection "127.0.0.1:48642" response transport failed `read tcp 127.0.0.1:48642->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.236+05:30 [Error] receiving packet: read tcp 127.0.0.1:48644->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.236+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(1672105423956227757) connection "127.0.0.1:48644" response transport failed `read tcp 127.0.0.1:48644->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.236+05:30 [Error] receiving packet: read tcp 127.0.0.1:34306->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.236+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(1672105423956227757) connection "127.0.0.1:34306" response transport failed `read tcp 127.0.0.1:34306->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.239+05:30 [Error] receiving packet: read tcp 127.0.0.1:48646->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.239+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(1672105423956227757) connection "127.0.0.1:48646" response transport failed `read tcp 127.0.0.1:48646->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.243+05:30 [Error] receiving packet: read tcp 127.0.0.1:34314->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.243+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(1672105423956227757) connection "127.0.0.1:34314" response transport failed `read tcp 127.0.0.1:34314->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.245+05:30 [Error] receiving packet: read tcp 127.0.0.1:34316->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.245+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(1672105423956227757) connection "127.0.0.1:34316" response transport failed `read tcp 127.0.0.1:34316->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.261+05:30 [Error] receiving packet: read tcp 127.0.0.1:48654->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.261+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(4485015274241426613) connection "127.0.0.1:48654" response transport failed `read tcp 127.0.0.1:48654->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.262+05:30 [Error] receiving packet: read tcp 127.0.0.1:34318->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.262+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(4485015274241426613) connection "127.0.0.1:34318" response transport failed `read tcp 127.0.0.1:34318->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.265+05:30 [Error] receiving packet: read tcp 127.0.0.1:34322->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.265+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(4485015274241426613) connection "127.0.0.1:34322" response transport failed `read tcp 127.0.0.1:34322->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.268+05:30 [Error] receiving packet: read tcp 127.0.0.1:34324->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.268+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(4485015274241426613) connection "127.0.0.1:34324" response transport failed `read tcp 127.0.0.1:34324->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.277+05:30 [Error] receiving packet: read tcp 127.0.0.1:48660->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.277+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(5469636339918204445) connection "127.0.0.1:48660" response transport failed `read tcp 127.0.0.1:48660->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.280+05:30 [Error] receiving packet: read tcp 127.0.0.1:48662->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.280+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(5469636339918204445) connection "127.0.0.1:48662" response transport failed `read tcp 127.0.0.1:48662->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.288+05:30 [Error] receiving packet: read tcp 127.0.0.1:34330->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.288+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(1437279950161071098) connection "127.0.0.1:34330" response transport failed `read tcp 127.0.0.1:34330->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.291+05:30 [Error] receiving packet: read tcp 127.0.0.1:48666->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.291+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(1437279950161071098) connection "127.0.0.1:48666" response transport failed `read tcp 127.0.0.1:48666->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.295+05:30 [Error] receiving packet: read tcp 127.0.0.1:34334->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.295+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(1437279950161071098) connection "127.0.0.1:34334" response transport failed `read tcp 127.0.0.1:34334->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.299+05:30 [Error] receiving packet: read tcp 127.0.0.1:34336->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.299+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(1437279950161071098) connection "127.0.0.1:34336" response transport failed `read tcp 127.0.0.1:34336->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.335+05:30 [Error] receiving packet: read tcp 127.0.0.1:48672->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.336+05:30 [Error] receiving packet: read tcp 127.0.0.1:34340->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.336+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(2320921195721541500) connection "127.0.0.1:48672" response transport failed `read tcp 127.0.0.1:48672->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.336+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(2320921195721541500) connection "127.0.0.1:34340" response transport failed `read tcp 127.0.0.1:34340->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.348+05:30 [Error] receiving packet: read tcp 127.0.0.1:48678->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.348+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-2886244517293985493) connection "127.0.0.1:48678" response transport failed `read tcp 127.0.0.1:48678->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.352+05:30 [Error] receiving packet: read tcp 127.0.0.1:48682->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.352+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-2886244517293985493) connection "127.0.0.1:48682" response transport failed `read tcp 127.0.0.1:48682->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.373+05:30 [Error] receiving packet: read tcp 127.0.0.1:34350->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.373+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-7055816512880610232) connection "127.0.0.1:34350" response transport failed `read tcp 127.0.0.1:34350->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.393+05:30 [Error] receiving packet: read tcp 127.0.0.1:48688->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.393+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-5243256402013287547) connection "127.0.0.1:48688" response transport failed `read tcp 127.0.0.1:48688->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.393+05:30 [Error] receiving packet: read tcp 127.0.0.1:34352->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.393+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-5243256402013287547) connection "127.0.0.1:34352" response transport failed `read tcp 127.0.0.1:34352->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.396+05:30 [Error] receiving packet: read tcp 127.0.0.1:48690->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.396+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-5243256402013287547) connection "127.0.0.1:48690" response transport failed `read tcp 127.0.0.1:48690->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.426+05:30 [Error] receiving packet: read tcp 127.0.0.1:34358->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.426+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-3366183429280951335) connection "127.0.0.1:34358" response transport failed `read tcp 127.0.0.1:34358->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.426+05:30 [Error] receiving packet: read tcp 127.0.0.1:48694->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.426+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-3366183429280951335) connection "127.0.0.1:48694" response transport failed `read tcp 127.0.0.1:48694->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.443+05:30 [Error] receiving packet: read tcp 127.0.0.1:34364->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.443+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(6618139498869570922) connection "127.0.0.1:34364" response transport failed `read tcp 127.0.0.1:34364->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.443+05:30 [Error] receiving packet: read tcp 127.0.0.1:48696->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.443+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(6618139498869570922) connection "127.0.0.1:48696" response transport failed `read tcp 127.0.0.1:48696->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.455+05:30 [Error] receiving packet: read tcp 127.0.0.1:48700->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.455+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(5219559874593968101) connection "127.0.0.1:48700" response transport failed `read tcp 127.0.0.1:48700->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.459+05:30 [Error] receiving packet: read tcp 127.0.0.1:34370->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.459+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(5219559874593968101) connection "127.0.0.1:34370" response transport failed `read tcp 127.0.0.1:34370->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.468+05:30 [Error] receiving packet: read tcp 127.0.0.1:48706->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.468+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(5094372542684785816) connection "127.0.0.1:48706" response transport failed `read tcp 127.0.0.1:48706->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.481+05:30 [Error] receiving packet: read tcp 127.0.0.1:34376->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.481+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-8349752324449137326) connection "127.0.0.1:34376" response transport failed `read tcp 127.0.0.1:34376->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.485+05:30 [Error] receiving packet: read tcp 127.0.0.1:48708->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.485+05:30 [Error] receiving packet: read tcp 127.0.0.1:34378->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.485+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-8349752324449137326) connection "127.0.0.1:48708" response transport failed `read tcp 127.0.0.1:48708->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.485+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-8349752324449137326) connection "127.0.0.1:34378" response transport failed `read tcp 127.0.0.1:34378->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.494+05:30 [Error] receiving packet: read tcp 127.0.0.1:34380->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.494+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(786555624563746895) connection "127.0.0.1:34380" response transport failed `read tcp 127.0.0.1:34380->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.507+05:30 [Error] receiving packet: read tcp 127.0.0.1:48720->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.507+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(4766609074302547180) connection "127.0.0.1:48720" response transport failed `read tcp 127.0.0.1:48720->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.535+05:30 [Error] receiving packet: read tcp 127.0.0.1:48722->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.535+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(1507717247770719761) connection "127.0.0.1:48722" response transport failed `read tcp 127.0.0.1:48722->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.536+05:30 [Error] receiving packet: read tcp 127.0.0.1:34382->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.536+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(1507717247770719761) connection "127.0.0.1:34382" response transport failed `read tcp 127.0.0.1:34382->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.539+05:30 [Error] receiving packet: read tcp 127.0.0.1:34390->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.539+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(1507717247770719761) connection "127.0.0.1:34390" response transport failed `read tcp 127.0.0.1:34390->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.549+05:30 [Error] receiving packet: read tcp 127.0.0.1:48726->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.549+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-8808113625673421272) connection "127.0.0.1:48726" response transport failed `read tcp 127.0.0.1:48726->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.551+05:30 [Error] receiving packet: read tcp 127.0.0.1:48728->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.551+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-8808113625673421272) connection "127.0.0.1:48728" response transport failed `read tcp 127.0.0.1:48728->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.555+05:30 [Error] receiving packet: read tcp 127.0.0.1:48730->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.555+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-8808113625673421272) connection "127.0.0.1:48730" response transport failed `read tcp 127.0.0.1:48730->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.557+05:30 [Error] receiving packet: read tcp 127.0.0.1:34398->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.557+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-8808113625673421272) connection "127.0.0.1:34398" response transport failed `read tcp 127.0.0.1:34398->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.560+05:30 [Error] receiving packet: read tcp 127.0.0.1:48734->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.560+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-8808113625673421272) connection "127.0.0.1:48734" response transport failed `read tcp 127.0.0.1:48734->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.563+05:30 [Error] receiving packet: read tcp 127.0.0.1:48736->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.563+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-8808113625673421272) connection "127.0.0.1:48736" response transport failed `read tcp 127.0.0.1:48736->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.574+05:30 [Error] receiving packet: read tcp 127.0.0.1:48740->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.574+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-7253836918076445774) connection "127.0.0.1:48740" response transport failed `read tcp 127.0.0.1:48740->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.589+05:30 [Error] receiving packet: read tcp 127.0.0.1:48742->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.589+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-5385875741086774285) connection "127.0.0.1:48742" response transport failed `read tcp 127.0.0.1:48742->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.592+05:30 [Error] receiving packet: read tcp 127.0.0.1:48744->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.592+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-5385875741086774285) connection "127.0.0.1:48744" response transport failed `read tcp 127.0.0.1:48744->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.616+05:30 [Error] receiving packet: read tcp 127.0.0.1:34404->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.616+05:30 [Error] receiving packet: read tcp 127.0.0.1:48748->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.616+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-2919586658577084077) connection "127.0.0.1:34404" response transport failed `read tcp 127.0.0.1:34404->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.616+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-2919586658577084077) connection "127.0.0.1:48748" response transport failed `read tcp 127.0.0.1:48748->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.647+05:30 [Error] receiving packet: read tcp 127.0.0.1:48750->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.647+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(2667270665763310925) connection "127.0.0.1:48750" response transport failed `read tcp 127.0.0.1:48750->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.653+05:30 [Error] receiving packet: read tcp 127.0.0.1:34418->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.653+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(3423195372295444973) connection "127.0.0.1:34418" response transport failed `read tcp 127.0.0.1:34418->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.661+05:30 [Error] receiving packet: read tcp 127.0.0.1:48754->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.661+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-8830911126779321384) connection "127.0.0.1:48754" response transport failed `read tcp 127.0.0.1:48754->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.692+05:30 [Error] receiving packet: read tcp 127.0.0.1:34422->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.692+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(1072009746816234339) connection "127.0.0.1:34422" response transport failed `read tcp 127.0.0.1:34422->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.699+05:30 [Error] receiving packet: read tcp 127.0.0.1:48758->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.699+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-4555991127957130695) connection "127.0.0.1:48758" response transport failed `read tcp 127.0.0.1:48758->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.718+05:30 [Error] receiving packet: read tcp 127.0.0.1:34430->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.718+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(6840552695568002135) connection "127.0.0.1:34430" response transport failed `read tcp 127.0.0.1:34430->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.732+05:30 [Error] receiving packet: read tcp 127.0.0.1:48770->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.732+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(5148829790496708256) connection "127.0.0.1:48770" response transport failed `read tcp 127.0.0.1:48770->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.743+05:30 [Error] receiving packet: read tcp 127.0.0.1:34442->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.743+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(8917148948413161970) connection "127.0.0.1:34442" response transport failed `read tcp 127.0.0.1:34442->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.746+05:30 [Error] receiving packet: read tcp 127.0.0.1:34444->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.746+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(8917148948413161970) connection "127.0.0.1:34444" response transport failed `read tcp 127.0.0.1:34444->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.756+05:30 [Error] receiving packet: read tcp 127.0.0.1:34446->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.757+05:30 [Error] receiving packet: read tcp 127.0.0.1:48774->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.757+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-2628763188925764056) connection "127.0.0.1:48774" response transport failed `read tcp 127.0.0.1:48774->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.757+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-2628763188925764056) connection "127.0.0.1:34446" response transport failed `read tcp 127.0.0.1:34446->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.759+05:30 [Error] receiving packet: read tcp 127.0.0.1:48782->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.759+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-2628763188925764056) connection "127.0.0.1:48782" response transport failed `read tcp 127.0.0.1:48782->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.853+05:30 [Error] receiving packet: read tcp 127.0.0.1:34450->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.854+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(2423491041062074934) connection "127.0.0.1:34450" response transport failed `read tcp 127.0.0.1:34450->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.867+05:30 [Error] receiving packet: read tcp 127.0.0.1:34460->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.867+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(45284343318983421) connection "127.0.0.1:34460" response transport failed `read tcp 127.0.0.1:34460->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.867+05:30 [Error] receiving packet: read tcp 127.0.0.1:48786->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.867+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(45284343318983421) connection "127.0.0.1:48786" response transport failed `read tcp 127.0.0.1:48786->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.870+05:30 [Error] receiving packet: read tcp 127.0.0.1:34462->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.870+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(45284343318983421) connection "127.0.0.1:34462" response transport failed `read tcp 127.0.0.1:34462->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.873+05:30 [Error] receiving packet: read tcp 127.0.0.1:34464->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.873+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(45284343318983421) connection "127.0.0.1:34464" response transport failed `read tcp 127.0.0.1:34464->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.875+05:30 [Error] receiving packet: read tcp 127.0.0.1:34466->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.875+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(45284343318983421) connection "127.0.0.1:34466" response transport failed `read tcp 127.0.0.1:34466->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.878+05:30 [Error] receiving packet: read tcp 127.0.0.1:48802->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.878+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(45284343318983421) connection "127.0.0.1:48802" response transport failed `read tcp 127.0.0.1:48802->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:23.881+05:30 [Error] receiving packet: read tcp 127.0.0.1:34470->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:23.881+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(45284343318983421) connection "127.0.0.1:34470" response transport failed `read tcp 127.0.0.1:34470->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:23.882+05:30 [Error] receiving packet: read tcp 127.0.0.1:48806->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:23.882+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(45284343318983421) connection "127.0.0.1:48806" response transport failed `read tcp 127.0.0.1:48806->127.0.0.1:9113: i/o timeout`
2023/02/16 21:00:27 Rebalance progress: 24
2023-02-16T21:00:27.294+05:30 [Error] receiving packet: read tcp 127.0.0.1:34474->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.294+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(1126842458120882062) connection "127.0.0.1:34474" response transport failed `read tcp 127.0.0.1:34474->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.302+05:30 [Error] receiving packet: read tcp 127.0.0.1:34542->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.302+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-7925322996383300867) connection "127.0.0.1:34542" response transport failed `read tcp 127.0.0.1:34542->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.305+05:30 [Error] receiving packet: read tcp 127.0.0.1:34544->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.305+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-7925322996383300867) connection "127.0.0.1:34544" response transport failed `read tcp 127.0.0.1:34544->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.313+05:30 [Error] receiving packet: read tcp 127.0.0.1:34546->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.313+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(2228102627138669523) connection "127.0.0.1:34546" response transport failed `read tcp 127.0.0.1:34546->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.320+05:30 [Error] receiving packet: read tcp 127.0.0.1:48812->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.320+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(5429813398286014359) connection "127.0.0.1:48812" response transport failed `read tcp 127.0.0.1:48812->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.333+05:30 [Error] receiving packet: read tcp 127.0.0.1:48884->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.333+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-2517188769035238038) connection "127.0.0.1:48884" response transport failed `read tcp 127.0.0.1:48884->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.344+05:30 [Error] receiving packet: read tcp 127.0.0.1:34548->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.345+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-6242671707268581835) connection "127.0.0.1:34548" response transport failed `read tcp 127.0.0.1:34548->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.345+05:30 [Error] receiving packet: read tcp 127.0.0.1:48886->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.345+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-6242671707268581835) connection "127.0.0.1:48886" response transport failed `read tcp 127.0.0.1:48886->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.353+05:30 [Error] receiving packet: read tcp 127.0.0.1:34554->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.353+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-6242671707268581835) connection "127.0.0.1:34554" response transport failed `read tcp 127.0.0.1:34554->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.384+05:30 [Error] receiving packet: read tcp 127.0.0.1:34556->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.384+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-6597211022724008565) connection "127.0.0.1:34556" response transport failed `read tcp 127.0.0.1:34556->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.385+05:30 [Error] receiving packet: read tcp 127.0.0.1:48892->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.385+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-6597211022724008565) connection "127.0.0.1:48892" response transport failed `read tcp 127.0.0.1:48892->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.392+05:30 [Error] receiving packet: read tcp 127.0.0.1:34562->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.392+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-6597211022724008565) connection "127.0.0.1:34562" response transport failed `read tcp 127.0.0.1:34562->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.406+05:30 [Error] receiving packet: read tcp 127.0.0.1:48898->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.406+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-6222074622669712373) connection "127.0.0.1:48898" response transport failed `read tcp 127.0.0.1:48898->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.419+05:30 [Error] receiving packet: read tcp 127.0.0.1:48900->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.419+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-4275783715875076368) connection "127.0.0.1:48900" response transport failed `read tcp 127.0.0.1:48900->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.423+05:30 [Error] receiving packet: read tcp 127.0.0.1:34568->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.423+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-4275783715875076368) connection "127.0.0.1:34568" response transport failed `read tcp 127.0.0.1:34568->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.448+05:30 [Error] receiving packet: read tcp 127.0.0.1:48904->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.448+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(9074346454380143085) connection "127.0.0.1:48904" response transport failed `read tcp 127.0.0.1:48904->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.454+05:30 [Error] receiving packet: read tcp 127.0.0.1:48910->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.454+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(9074346454380143085) connection "127.0.0.1:48910" response transport failed `read tcp 127.0.0.1:48910->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.463+05:30 [Error] receiving packet: read tcp 127.0.0.1:48912->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.463+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(6197699905637376288) connection "127.0.0.1:48912" response transport failed `read tcp 127.0.0.1:48912->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.467+05:30 [Error] receiving packet: read tcp 127.0.0.1:34574->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.467+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(6197699905637376288) connection "127.0.0.1:34574" response transport failed `read tcp 127.0.0.1:34574->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.467+05:30 [Error] receiving packet: read tcp 127.0.0.1:48914->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.467+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(6197699905637376288) connection "127.0.0.1:48914" response transport failed `read tcp 127.0.0.1:48914->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.486+05:30 [Error] receiving packet: read tcp 127.0.0.1:48918->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.486+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(1485043527521591171) connection "127.0.0.1:48918" response transport failed `read tcp 127.0.0.1:48918->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.500+05:30 [Error] receiving packet: read tcp 127.0.0.1:34582->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.500+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(8243389078308426152) connection "127.0.0.1:34582" response transport failed `read tcp 127.0.0.1:34582->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.510+05:30 [Error] receiving packet: read tcp 127.0.0.1:34588->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.511+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-6162613609912502843) connection "127.0.0.1:34588" response transport failed `read tcp 127.0.0.1:34588->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.531+05:30 [Error] receiving packet: read tcp 127.0.0.1:48920->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.531+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(5184705301308677179) connection "127.0.0.1:48920" response transport failed `read tcp 127.0.0.1:48920->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.531+05:30 [Error] receiving packet: read tcp 127.0.0.1:34594->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.531+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(5184705301308677179) connection "127.0.0.1:34594" response transport failed `read tcp 127.0.0.1:34594->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.557+05:30 [Error] receiving packet: read tcp 127.0.0.1:48930->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.557+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-6838210475655860366) connection "127.0.0.1:48930" response transport failed `read tcp 127.0.0.1:48930->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.567+05:30 [Error] receiving packet: read tcp 127.0.0.1:34598->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.567+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(6851487559470874312) connection "127.0.0.1:34598" response transport failed `read tcp 127.0.0.1:34598->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.567+05:30 [Error] receiving packet: read tcp 127.0.0.1:48936->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.567+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(6851487559470874312) connection "127.0.0.1:48936" response transport failed `read tcp 127.0.0.1:48936->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.603+05:30 [Error] receiving packet: read tcp 127.0.0.1:34606->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.603+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-2288847024370749769) connection "127.0.0.1:34606" response transport failed `read tcp 127.0.0.1:34606->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.603+05:30 [Error] receiving packet: read tcp 127.0.0.1:48938->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.603+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-2288847024370749769) connection "127.0.0.1:48938" response transport failed `read tcp 127.0.0.1:48938->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.607+05:30 [Error] receiving packet: read tcp 127.0.0.1:48946->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.607+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-2288847024370749769) connection "127.0.0.1:48946" response transport failed `read tcp 127.0.0.1:48946->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.643+05:30 [Error] receiving packet: read tcp 127.0.0.1:34618->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.644+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(8251015179821562822) connection "127.0.0.1:34618" response transport failed `read tcp 127.0.0.1:34618->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.645+05:30 [Error] receiving packet: read tcp 127.0.0.1:48948->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.645+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(8251015179821562822) connection "127.0.0.1:48948" response transport failed `read tcp 127.0.0.1:48948->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.648+05:30 [Error] receiving packet: read tcp 127.0.0.1:48954->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.648+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(8251015179821562822) connection "127.0.0.1:48954" response transport failed `read tcp 127.0.0.1:48954->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.652+05:30 [Error] receiving packet: read tcp 127.0.0.1:48956->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.652+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(8251015179821562822) connection "127.0.0.1:48956" response transport failed `read tcp 127.0.0.1:48956->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.655+05:30 [Error] receiving packet: read tcp 127.0.0.1:48958->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.655+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(8251015179821562822) connection "127.0.0.1:48958" response transport failed `read tcp 127.0.0.1:48958->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.669+05:30 [Error] receiving packet: read tcp 127.0.0.1:34626->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.669+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(8183661422308271044) connection "127.0.0.1:34626" response transport failed `read tcp 127.0.0.1:34626->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.718+05:30 [Error] receiving packet: read tcp 127.0.0.1:48964->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.718+05:30 [Error] receiving packet: read tcp 127.0.0.1:34628->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.718+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-8230489961394221757) connection "127.0.0.1:34628" response transport failed `read tcp 127.0.0.1:34628->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.718+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-8230489961394221757) connection "127.0.0.1:48964" response transport failed `read tcp 127.0.0.1:48964->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.721+05:30 [Error] receiving packet: read tcp 127.0.0.1:34632->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.721+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-8230489961394221757) connection "127.0.0.1:34632" response transport failed `read tcp 127.0.0.1:34632->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.736+05:30 [Error] receiving packet: read tcp 127.0.0.1:48970->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.736+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-2832761239148265340) connection "127.0.0.1:48970" response transport failed `read tcp 127.0.0.1:48970->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.736+05:30 [Error] receiving packet: read tcp 127.0.0.1:34634->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.736+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-2832761239148265340) connection "127.0.0.1:34634" response transport failed `read tcp 127.0.0.1:34634->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.815+05:30 [Error] receiving packet: read tcp 127.0.0.1:48972->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.815+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(6404017418739702369) connection "127.0.0.1:48972" response transport failed `read tcp 127.0.0.1:48972->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.824+05:30 [Error] receiving packet: read tcp 127.0.0.1:34640->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.824+05:30 [Error] receiving packet: read tcp 127.0.0.1:48986->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.824+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(6404017418739702369) connection "127.0.0.1:48986" response transport failed `read tcp 127.0.0.1:48986->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.824+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(6404017418739702369) connection "127.0.0.1:34640" response transport failed `read tcp 127.0.0.1:34640->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.855+05:30 [Error] receiving packet: read tcp 127.0.0.1:34654->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.855+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-8836990352993827580) connection "127.0.0.1:34654" response transport failed `read tcp 127.0.0.1:34654->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.858+05:30 [Error] receiving packet: read tcp 127.0.0.1:34658->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.858+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-8836990352993827580) connection "127.0.0.1:34658" response transport failed `read tcp 127.0.0.1:34658->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.858+05:30 [Error] receiving packet: read tcp 127.0.0.1:48990->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.858+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-8836990352993827580) connection "127.0.0.1:48990" response transport failed `read tcp 127.0.0.1:48990->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.862+05:30 [Error] receiving packet: read tcp 127.0.0.1:48994->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.862+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-8836990352993827580) connection "127.0.0.1:48994" response transport failed `read tcp 127.0.0.1:48994->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.879+05:30 [Error] receiving packet: read tcp 127.0.0.1:34664->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.879+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(8705816520062263861) connection "127.0.0.1:34664" response transport failed `read tcp 127.0.0.1:34664->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.888+05:30 [Error] receiving packet: read tcp 127.0.0.1:48996->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.888+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-158272059621125139) connection "127.0.0.1:48996" response transport failed `read tcp 127.0.0.1:48996->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.896+05:30 [Error] receiving packet: read tcp 127.0.0.1:49004->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.896+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(7227822581715549911) connection "127.0.0.1:49004" response transport failed `read tcp 127.0.0.1:49004->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.917+05:30 [Error] receiving packet: read tcp 127.0.0.1:34668->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.917+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-3515594456134944231) connection "127.0.0.1:34668" response transport failed `read tcp 127.0.0.1:34668->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.929+05:30 [Error] receiving packet: read tcp 127.0.0.1:34674->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.929+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(4230528336135525952) connection "127.0.0.1:34674" response transport failed `read tcp 127.0.0.1:34674->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.929+05:30 [Error] receiving packet: read tcp 127.0.0.1:49006->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.929+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(4230528336135525952) connection "127.0.0.1:49006" response transport failed `read tcp 127.0.0.1:49006->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.949+05:30 [Error] receiving packet: read tcp 127.0.0.1:49010->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.949+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-659171060452959531) connection "127.0.0.1:49010" response transport failed `read tcp 127.0.0.1:49010->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.957+05:30 [Error] receiving packet: read tcp 127.0.0.1:34678->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.957+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-7198309254489781394) connection "127.0.0.1:34678" response transport failed `read tcp 127.0.0.1:34678->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.957+05:30 [Error] receiving packet: read tcp 127.0.0.1:49014->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.957+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-7198309254489781394) connection "127.0.0.1:49014" response transport failed `read tcp 127.0.0.1:49014->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.962+05:30 [Error] receiving packet: read tcp 127.0.0.1:34682->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.962+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-7198309254489781394) connection "127.0.0.1:34682" response transport failed `read tcp 127.0.0.1:34682->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.971+05:30 [Error] receiving packet: read tcp 127.0.0.1:49020->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.971+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-3510869349958437967) connection "127.0.0.1:49020" response transport failed `read tcp 127.0.0.1:49020->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:27.982+05:30 [Error] receiving packet: read tcp 127.0.0.1:34684->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:27.983+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-4425374621287004333) connection "127.0.0.1:34684" response transport failed `read tcp 127.0.0.1:34684->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:27.993+05:30 [Error] receiving packet: read tcp 127.0.0.1:49024->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:27.993+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-7444540342908855766) connection "127.0.0.1:49024" response transport failed `read tcp 127.0.0.1:49024->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:28.010+05:30 [Error] receiving packet: read tcp 127.0.0.1:49026->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:28.010+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(2072939636555317601) connection "127.0.0.1:49026" response transport failed `read tcp 127.0.0.1:49026->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:28.016+05:30 [Error] receiving packet: read tcp 127.0.0.1:34688->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:28.016+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(6540994901657749144) connection "127.0.0.1:34688" response transport failed `read tcp 127.0.0.1:34688->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:28.028+05:30 [Error] receiving packet: read tcp 127.0.0.1:34694->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:28.028+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(-283829549800027838) connection "127.0.0.1:34694" response transport failed `read tcp 127.0.0.1:34694->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:28.049+05:30 [Error] receiving packet: read tcp 127.0.0.1:49030->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:28.049+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(6284259301902719244) connection "127.0.0.1:49030" response transport failed `read tcp 127.0.0.1:49030->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:28.069+05:30 [Error] receiving packet: read tcp 127.0.0.1:49034->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:28.069+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(1481643399309246669) connection "127.0.0.1:49034" response transport failed `read tcp 127.0.0.1:49034->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:28.079+05:30 [Error] receiving packet: read tcp 127.0.0.1:49038->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:28.079+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(1302772702476281476) connection "127.0.0.1:49038" response transport failed `read tcp 127.0.0.1:49038->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:28.087+05:30 [Error] receiving packet: read tcp 127.0.0.1:49040->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:28.087+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-6021818326167751346) connection "127.0.0.1:49040" response transport failed `read tcp 127.0.0.1:49040->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:28.096+05:30 [Error] receiving packet: read tcp 127.0.0.1:34698->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:28.096+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(6578738255187949791) connection "127.0.0.1:34698" response transport failed `read tcp 127.0.0.1:34698->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:28.106+05:30 [Error] receiving packet: read tcp 127.0.0.1:34712->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:28.106+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(3852991799209967918) connection "127.0.0.1:34712" response transport failed `read tcp 127.0.0.1:34712->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:28.115+05:30 [Error] receiving packet: read tcp 127.0.0.1:49042->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:28.115+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-4958206185072871825) connection "127.0.0.1:49042" response transport failed `read tcp 127.0.0.1:49042->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:28.123+05:30 [Error] receiving packet: read tcp 127.0.0.1:49050->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:28.123+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-7219362457125409548) connection "127.0.0.1:49050" response transport failed `read tcp 127.0.0.1:49050->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:28.127+05:30 [Error] receiving packet: read tcp 127.0.0.1:49052->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:28.127+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(-7219362457125409548) connection "127.0.0.1:49052" response transport failed `read tcp 127.0.0.1:49052->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:28.135+05:30 [Error] receiving packet: read tcp 127.0.0.1:34714->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:28.135+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(6402451853971523405) connection "127.0.0.1:34714" response transport failed `read tcp 127.0.0.1:34714->127.0.0.1:9107: i/o timeout`
2023-02-16T21:00:28.135+05:30 [Error] receiving packet: read tcp 127.0.0.1:49054->127.0.0.1:9113: i/o timeout
2023-02-16T21:00:28.135+05:30 [Error] [GsiScanClient:"127.0.0.1:9113"] req(6402451853971523405) connection "127.0.0.1:49054" response transport failed `read tcp 127.0.0.1:49054->127.0.0.1:9113: i/o timeout`
2023-02-16T21:00:28.137+05:30 [Error] receiving packet: read tcp 127.0.0.1:34722->127.0.0.1:9107: i/o timeout
2023-02-16T21:00:28.138+05:30 [Error] [GsiScanClient:"127.0.0.1:9107"] req(6402451853971523405) connection "127.0.0.1:34722" response transport failed `read tcp 127.0.0.1:34722->127.0.0.1:9107: i/o timeout`
2023/02/16 21:00:32 Rebalance progress: 24
...
2023/02/16 21:01:57 Rebalance progress: 24
2023/02/16 21:02:02 Rebalance progress: 38
...
2023/02/16 21:28:47 Rebalance progress: 38
    common_test.go:165: Error while removing nodes: [127.0.0.1:9001 127.0.0.1:9002] from cluster: RemoveNodes: Error during rebalance, err: Rebalance did not finish after 30 minutes
--- FAIL: TestTwoNodeSwapRebalance (1816.85s)
=== RUN   TestSingleNodeSwapRebalance
2023/02/16 21:28:47 In TestSingleNodeSwapRebalance
2023/02/16 21:28:47 Adding node: https://127.0.0.1:19002 with role: index to the cluster with uri: /pools/default/serverGroups/0/addNode
...
2023/02/16 21:29:19 Adding node: https://127.0.0.1:19002 with role: index to the cluster with uri: /pools/default/serverGroups/0/addNode
    common_test.go:165: Error while adding node 127.0.0.1:9002 cluster in server group: Group 1: AddNodeWithServerGroup: Unexpected response body while adding node: 127.0.0.1:9002 (role: index, serverGroup: Group 1), response: ["Node addition is disallowed while rebalance is in progress"]
--- FAIL: TestSingleNodeSwapRebalance (32.04s)
=== RUN   TestReplicaRepair
2023/02/16 21:29:19 In TestReplicaRepair
2023/02/16 21:29:19 Failing over: [127.0.0.1:9003]
    common_test.go:165: Error while failing over nodes: 127.0.0.1:9003 from cluster: Error removing node and rebalancing, rebalanceFromRest response: Rebalance running.
--- FAIL: TestReplicaRepair (0.06s)
=== RUN   TestReplicaRepairAndSwapRebalance
2023/02/16 21:29:19 In TestReplicaRepairAndSwapRebalance
2023/02/16 21:29:19 Failing over: [127.0.0.1:9002]
    common_test.go:165: Error while failing over nodes: 127.0.0.1:9002 from cluster: Error removing node and rebalancing, rebalanceFromRest response: Rebalance running.
--- FAIL: TestReplicaRepairAndSwapRebalance (0.05s)
=== RUN   TestBuildDeferredIndexesAfterRebalance
2023/02/16 21:29:19 In TestBuildDeferredIndexesAfterRebalance
2023/02/16 21:29:19 Build command issued for the deferred indexes [idx_secondary_defer], bucket: bucket_1, scope: _default, coll: _default
    set01_rebalance_test.go:176: Error observed while building indexes: idx_secondary_defer
--- FAIL: TestBuildDeferredIndexesAfterRebalance (0.08s)
=== RUN   TestDropIndexAfterRebalance
2023/02/16 21:29:19 In TestDropIndexAfterRebalance
2023/02/16 21:29:19 Dropping the secondary index idx_secondary
2023/02/16 21:29:19 Index dropped
2023/02/16 21:29:19 Dropping the secondary index idx_secondary_defer
2023/02/16 21:29:20 Index dropped
2023/02/16 21:29:20 Dropping the secondary index idx_secondary
2023/02/16 21:29:20 Index dropped
2023/02/16 21:29:20 Dropping the secondary index idx_secondary_defer
2023/02/16 21:29:20 Index dropped
2023/02/16 21:29:20 Dropping the secondary index idx_secondary
2023/02/16 21:29:20 Index dropped
2023/02/16 21:29:20 Dropping the secondary index idx_secondary_defer
2023/02/16 21:29:20 Index dropped
2023/02/16 21:29:20 Dropping the secondary index idx_secondary
2023/02/16 21:29:20 Index dropped
2023/02/16 21:29:20 Dropping the secondary index idx_secondary_defer
2023/02/16 21:29:21 Index dropped
2023/02/16 21:29:21 Dropping the secondary index idx_secondary
2023/02/16 21:29:21 Index dropped
2023/02/16 21:29:21 Dropping the secondary index idx_secondary_defer
2023/02/16 21:29:21 Index dropped
2023/02/16 21:29:21 Dropping the secondary index idx_secondary
2023/02/16 21:29:21 Index dropped
2023/02/16 21:29:21 Dropping the secondary index idx_secondary_defer
2023/02/16 21:29:21 Index dropped
2023/02/16 21:31:26 Scan failed as expected with error: Index Not Found - cause: GSI index idx_secondary not found., index: idx_secondary, bucket: bucket_1, scope: _default, collection: _default
2023/02/16 21:31:26 Scan failed as expected with error: Index Not Found - cause: GSI index idx_secondary_defer not found., index: idx_secondary_defer, bucket: bucket_1, scope: _default, collection: _default
2023/02/16 21:31:26 Scan failed as expected with error: Index Not Found - cause: GSI index idx_secondary not found., index: idx_secondary, bucket: bucket_1, scope: _default, collection: c1
2023/02/16 21:31:26 Scan failed as expected with error: Index Not Found - cause: GSI index idx_secondary_defer not found., index: idx_secondary_defer, bucket: bucket_1, scope: _default, collection: c1
2023/02/16 21:31:26 Scan failed as expected with error: Index Not Found - cause: GSI index idx_secondary not found., index: idx_secondary, bucket: bucket_1, scope: _default, collection: c2%
2023/02/16 21:31:26 Scan failed as expected with error: Index Not Found - cause: GSI index idx_secondary_defer not found., index: idx_secondary_defer, bucket: bucket_1, scope: _default, collection: c2%
2023/02/16 21:31:26 Scan failed as expected with error: Index Not Found - cause: GSI index idx_secondary not found., index: idx_secondary, bucket: bucket_%2, scope: _default, collection: _default
2023/02/16 21:31:26 Scan failed as expected with error: Index Not Found - cause: GSI index idx_secondary_defer not found., index: idx_secondary_defer, bucket: bucket_%2, scope: _default, collection: _default
2023/02/16 21:31:26 Scan failed as expected with error: Index Not Found - cause: GSI index idx_secondary not found., index: idx_secondary, bucket: bucket_%2, scope: _default, collection: c1
2023/02/16 21:31:26 Scan failed as expected with error: Index Not Found - cause: GSI index idx_secondary_defer not found., index: idx_secondary_defer, bucket: bucket_%2, scope: _default, collection: c1
2023/02/16 21:31:26 Scan failed as expected with error: Index Not Found - cause: GSI index idx_secondary not found., index: idx_secondary, bucket: bucket_%2, scope: _default, collection: c2%
2023/02/16 21:31:26 Scan failed as expected with error: Index Not Found - cause: GSI index idx_secondary_defer not found., index: idx_secondary_defer, bucket: bucket_%2, scope: _default, collection: c2%
--- PASS: TestDropIndexAfterRebalance (127.32s)
=== RUN   TestRebalanceAfterDropIndexes
2023/02/16 21:31:26 In TestRebalanceAfterDropIndexes
2023/02/16 21:31:26 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:27 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:28 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:29 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:31 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:32 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:33 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:34 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:35 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:36 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:37 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:38 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:39 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:40 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:41 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:42 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:43 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:44 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:45 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:47 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:48 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:49 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:50 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:51 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:52 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:53 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:54 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:55 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:56 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:57 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:31:58 Adding node: https://127.0.0.1:19001 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
    common_test.go:165: Error while adding node 127.0.0.1:9001 cluster in server group: Group 2: AddNodeWithServerGroup: Unexpected response body while adding node: 127.0.0.1:9001 (role: index, serverGroup: Group 2), response: ["Node addition is disallowed while rebalance is in progress"]
--- FAIL: TestRebalanceAfterDropIndexes (32.05s)
=== RUN   TestCreateIndexsAfterRebalance
2023/02/16 21:31:58 In TestCreateIndexesAfterRebalance
2023/02/16 21:31:58 Executing N1ql statement: create index idx_secondary on `bucket_1`.`_default`.`_default`(age)
2023/02/16 21:33:13 Error in executing N1QL query, err: Post "http://172.31.5.112:9499/query/service": net/http: request canceled
    common_test.go:165: Error during n1qlExecute: create index idx_secondary on `bucket_1`.`_default`.`_default`(age): Post "http://172.31.5.112:9499/query/service": net/http: request canceled
--- FAIL: TestCreateIndexsAfterRebalance (75.03s)
=== RUN   TestRebalanceAfterDroppedCollections
2023/02/16 21:33:13 In TestRebalanceAfterDroppedCollections
2023/02/16 21:33:13 Dropped collection c1 for bucket: bucket_1, scope: _default, body: {"uid":"4"}
2023/02/16 21:33:14 Dropped collection c2% for bucket: bucket_1, scope: _default, body: {"uid":"5"}
2023/02/16 21:33:14 Dropped collection c1 for bucket: bucket_%2, scope: _default, body: {"uid":"4"}
2023/02/16 21:33:15 Dropped collection c2% for bucket: bucket_%2, scope: _default, body: {"uid":"5"}
2023/02/16 21:33:15 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:17 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:18 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:19 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:20 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:21 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:22 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:23 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:24 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:25 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:26 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:27 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:29 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:30 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:31 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:32 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:33 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:34 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:35 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:36 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:37 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:38 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:39 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:40 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:41 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:42 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:43 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:44 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:46 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:47 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
2023/02/16 21:33:48 Adding node: https://127.0.0.1:19003 with role: index to the cluster with uri: /pools/default/serverGroups/f5c2ee5ef6698970eee9566291d1ac81/addNode
    common_test.go:165: Error while adding node 127.0.0.1:9003 cluster in server group: Group 2: AddNodeWithServerGroup: Unexpected response body while adding node: 127.0.0.1:9003 (role: index, serverGroup: Group 2), response: ["Node addition is disallowed while rebalance is in progress"]
--- FAIL: TestRebalanceAfterDroppedCollections (34.39s)
=== RUN   TestRebalancePanicTestsSetup
2023/02/16 21:33:48 In DropAllSecondaryIndexes()
2023/02/16 21:33:48 Index found:  idx_secondary
2023/02/16 21:33:48 Error in DropAllSecondaryIndexes: The index was scheduled for background creation. The cleanup will happen in the background.
--- FAIL: TestRebalancePanicTestsSetup (0.05s)
panic: Error in DropAllSecondaryIndexes: The index was scheduled for background creation. The cleanup will happen in the background. [recovered]
	panic: Error in DropAllSecondaryIndexes: The index was scheduled for background creation. The cleanup will happen in the background.

goroutine 74516 [running]:
panic({0x106da60, 0xc00b467780})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/panic.go:987 +0x3ba fp=0xc01a583a60 sp=0xc01a5839a0 pc=0x441afa
testing.tRunner.func1.2({0x106da60, 0xc00b467780})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/testing/testing.go:1396 +0x24e fp=0xc01a583b10 sp=0xc01a583a60 pc=0x526f8e
testing.tRunner.func1()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/testing/testing.go:1399 +0x39f fp=0xc01a583c78 sp=0xc01a583b10 pc=0x526a1f
runtime.deferCallSave(0xc01a583d48, 0xc01a583f90?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/panic.go:796 +0x88 fp=0xc01a583c88 sp=0xc01a583c78 pc=0x4416e8
runtime.runOpenDeferFrame(0xc00a823400?, 0xc00dbbde50)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/panic.go:769 +0x1a5 fp=0xc01a583cd0 sp=0xc01a583c88 pc=0x441505
panic({0x106da60, 0xc00b467780})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/panic.go:884 +0x212 fp=0xc01a583d90 sp=0xc01a583cd0 pc=0x441952
log.Panicf({0x12761d1?, 0x20?}, {0xc01a583ed0?, 0x0?, 0x0?})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/log/log.go:395 +0x67 fp=0xc01a583dd8 sp=0xc01a583d90 pc=0x5db5c7
github.com/couchbase/indexing/secondary/tests/framework/common.HandleError(...)
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/tests/framework/common/util.go:77
github.com/couchbase/indexing/secondary/tests/serverlesstests.TestRebalancePanicTestsSetup(0x1000000000001?)
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/tests/serverlesstests/set02_rebalance_panic_test.go:18 +0xeb fp=0xc01a583f70 sp=0xc01a583dd8 pc=0xfb88ab
testing.tRunner(0xc04d1fd860, 0x12fffc8)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/testing/testing.go:1446 +0x10b fp=0xc01a583fc0 sp=0xc01a583f70 pc=0x5265cb
testing.(*T).Run.func1()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/testing/testing.go:1493 +0x2a fp=0xc01a583fe0 sp=0xc01a583fc0 pc=0x52746a
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc01a583fe8 sp=0xc01a583fe0 pc=0x476581
created by testing.(*T).Run
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/testing/testing.go:1493 +0x35f

goroutine 1 [chan receive]:
runtime.gopark(0x1?, 0xc00024b8b8?, 0x1f?, 0x58?, 0xc00024b868?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc00024b838 sp=0xc00024b818 pc=0x444c16
runtime.chanrecv(0xc00a0f2d90, 0xc00024b937, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/chan.go:583 +0x49b fp=0xc00024b8c8 sp=0xc00024b838 pc=0x40ec3b
runtime.chanrecv1(0x129375f?, 0x129375f?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/chan.go:442 +0x18 fp=0xc00024b8f0 sp=0xc00024b8c8 pc=0x40e738
testing.(*T).Run(0xc0001c6ea0, {0x129375f?, 0x525fa5?}, 0x12fffc8)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/testing/testing.go:1494 +0x37a fp=0xc00024b990 sp=0xc00024b8f0 pc=0x5273ba
testing.runTests.func1(0xc005276ba0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/testing/testing.go:1846 +0x6e fp=0xc00024b9e0 sp=0xc00024b990 pc=0x52930e
testing.tRunner(0xc0001c6ea0, 0xc00024bae8)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/testing/testing.go:1446 +0x10b fp=0xc00024ba30 sp=0xc00024b9e0 pc=0x5265cb
testing.runTests(0xc0002560a0?, {0x1fa8260, 0x22, 0x22}, {0x30?, 0x11419e0?, 0x2034940?})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/testing/testing.go:1844 +0x456 fp=0xc00024bb18 sp=0xc00024ba30 pc=0x5291b6
testing.(*M).Run(0xc0002560a0)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/testing/testing.go:1726 +0x5d9 fp=0xc00024bd10 sp=0xc00024bb18 pc=0x527c99
github.com/couchbase/indexing/secondary/tests/serverlesstests.TestMain(0xffffffffffffffff?)
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/tests/serverlesstests/common_test.go:160 +0x14a5 fp=0xc00024bec8 sp=0xc00024bd10 pc=0xfa9e05
main.main()
	_testmain.go:115 +0x1d3 fp=0xc00024bf80 sp=0xc00024bec8 pc=0xfc7c73
runtime.main()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:250 +0x212 fp=0xc00024bfe0 sp=0xc00024bf80 pc=0x444852
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc00024bfe8 sp=0xc00024bfe0 pc=0x476581

goroutine 2 [force gc (idle), 41 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc00006afb0 sp=0xc00006af90 pc=0x444c16
runtime.goparkunlock(...)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:369
runtime.forcegchelper()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:302 +0xad fp=0xc00006afe0 sp=0xc00006afb0 pc=0x444aad
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc00006afe8 sp=0xc00006afe0 pc=0x476581
created by runtime.init.6
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:290 +0x25

goroutine 18 [GC sweep wait]:
runtime.gopark(0x1?, 0x0?, 0x0?, 0x0?, 0x0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc000066790 sp=0xc000066770 pc=0x444c16
runtime.goparkunlock(...)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:369
runtime.bgsweep(0x0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/mgcsweep.go:297 +0xd7 fp=0xc0000667c8 sp=0xc000066790 pc=0x42f097
runtime.gcenable.func1()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/mgc.go:178 +0x26 fp=0xc0000667e0 sp=0xc0000667c8 pc=0x423d06
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc0000667e8 sp=0xc0000667e0 pc=0x476581
created by runtime.gcenable
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/mgc.go:178 +0x6b

goroutine 19 [GC scavenge wait]:
runtime.gopark(0x2034ee0?, 0x30ec02?, 0x0?, 0x0?, 0x0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc000066f70 sp=0xc000066f50 pc=0x444c16
runtime.goparkunlock(...)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:369
runtime.(*scavengerState).park(0x2034ee0)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/mgcscavenge.go:389 +0x53 fp=0xc000066fa0 sp=0xc000066f70 pc=0x42d0f3
runtime.bgscavenge(0x0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/mgcscavenge.go:622 +0x65 fp=0xc000066fc8 sp=0xc000066fa0 pc=0x42d6e5
runtime.gcenable.func2()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/mgc.go:179 +0x26 fp=0xc000066fe0 sp=0xc000066fc8 pc=0x423ca6
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc000066fe8 sp=0xc000066fe0 pc=0x476581
created by runtime.gcenable
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/mgc.go:179 +0xaa

goroutine 20 [finalizer wait]:
runtime.gopark(0x0?, 0x13054a0?, 0x0?, 0x20?, 0x2000000020?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc00006a628 sp=0xc00006a608 pc=0x444c16
runtime.goparkunlock(...)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:369
runtime.runfinq()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/mfinal.go:180 +0x10f fp=0xc00006a7e0 sp=0xc00006a628 pc=0x422e0f
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc00006a7e8 sp=0xc00006a7e0 pc=0x476581
created by runtime.createfing
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/mfinal.go:157 +0x45

goroutine 3 [select, 41 minutes]:
runtime.gopark(0xc00006b798?, 0x2?, 0x0?, 0x0?, 0xc00006b784?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc00006b610 sp=0xc00006b5f0 pc=0x444c16
runtime.selectgo(0xc00006b798, 0xc00006b780, 0x0?, 0x0, 0x0?, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/select.go:328 +0x7bc fp=0xc00006b750 sp=0xc00006b610 pc=0x454a9c
github.com/couchbase/cbauth/cbauthimpl.(*tlsNotifier).loop(0xc0000120f0)
	/opt/build/goproj/src/github.com/couchbase/cbauth/cbauthimpl/impl.go:405 +0x67 fp=0xc00006b7c8 sp=0xc00006b750 pc=0x7ab667
github.com/couchbase/cbauth/cbauthimpl.NewSVCForTest.func2()
	/opt/build/goproj/src/github.com/couchbase/cbauth/cbauthimpl/impl.go:565 +0x26 fp=0xc00006b7e0 sp=0xc00006b7c8 pc=0x7ac2c6
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc00006b7e8 sp=0xc00006b7e0 pc=0x476581
created by github.com/couchbase/cbauth/cbauthimpl.NewSVCForTest
	/opt/build/goproj/src/github.com/couchbase/cbauth/cbauthimpl/impl.go:565 +0x385

goroutine 4 [select, 41 minutes]:
runtime.gopark(0xc00006bf98?, 0x2?, 0x0?, 0x0?, 0xc00006bf8c?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc00006be08 sp=0xc00006bde8 pc=0x444c16
runtime.selectgo(0xc00006bf98, 0xc00006bf88, 0x0?, 0x0, 0x0?, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/select.go:328 +0x7bc fp=0xc00006bf48 sp=0xc00006be08 pc=0x454a9c
github.com/couchbase/cbauth/cbauthimpl.(*cfgChangeNotifier).loop(0xc000012108)
	/opt/build/goproj/src/github.com/couchbase/cbauth/cbauthimpl/impl.go:325 +0x85 fp=0xc00006bfc8 sp=0xc00006bf48 pc=0x7ab0c5
github.com/couchbase/cbauth/cbauthimpl.NewSVCForTest.func3()
	/opt/build/goproj/src/github.com/couchbase/cbauth/cbauthimpl/impl.go:566 +0x26 fp=0xc00006bfe0 sp=0xc00006bfc8 pc=0x7ac266
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc00006bfe8 sp=0xc00006bfe0 pc=0x476581
created by github.com/couchbase/cbauth/cbauthimpl.NewSVCForTest
	/opt/build/goproj/src/github.com/couchbase/cbauth/cbauthimpl/impl.go:566 +0x3d6

goroutine 5 [IO wait]:
runtime.gopark(0xc005ad0cb8?, 0xb?, 0x0?, 0x0?, 0x6?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc000155780 sp=0xc000155760 pc=0x444c16
runtime.netpollblock(0x4897c5?, 0xf2000?, 0xc0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/netpoll.go:526 +0xf7 fp=0xc0001557b8 sp=0xc000155780 pc=0x43d3f7
internal/poll.runtime_pollWait(0x7f05f8758918, 0x72)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/netpoll.go:305 +0x89 fp=0xc0001557d8 sp=0xc0001557b8 pc=0x470a09
internal/poll.(*pollDesc).wait(0xc00012a080?, 0xc0001a5000?, 0x0)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/internal/poll/fd_poll_runtime.go:84 +0x32 fp=0xc000155800 sp=0xc0001557d8 pc=0x4ab6f2
internal/poll.(*pollDesc).waitRead(...)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0xc00012a080, {0xc0001a5000, 0x1000, 0x1000})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/internal/poll/fd_unix.go:167 +0x25a fp=0xc000155880 sp=0xc000155800 pc=0x4aca5a
net.(*netFD).Read(0xc00012a080, {0xc0001a5000?, 0xc000155918?, 0x56f0ff?})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/fd_posix.go:55 +0x29 fp=0xc0001558c8 sp=0xc000155880 pc=0x68ae29
net.(*conn).Read(0xc0001284f8, {0xc0001a5000?, 0x570216?, 0x10744c0?})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/net.go:183 +0x45 fp=0xc000155910 sp=0xc0001558c8 pc=0x69b0a5
bufio.(*Reader).Read(0xc000114540, {0xc008ed9001, 0xdff, 0x45a394?})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/bufio/bufio.go:237 +0x1bb fp=0xc000155948 sp=0xc000155910 pc=0x52c99b
github.com/couchbase/cbauth/revrpc.(*minirwc).Read(0x1?, {0xc008ed9001?, 0x8?, 0x20?})
	/opt/build/goproj/src/github.com/couchbase/cbauth/revrpc/revrpc.go:103 +0x25 fp=0xc000155978 sp=0xc000155948 pc=0x813c85
encoding/json.(*Decoder).refill(0xc000003540)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/encoding/json/stream.go:165 +0x188 fp=0xc0001559c8 sp=0xc000155978 pc=0x582168
encoding/json.(*Decoder).readValue(0xc000003540)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/encoding/json/stream.go:140 +0xbb fp=0xc000155a18 sp=0xc0001559c8 pc=0x581d5b
encoding/json.(*Decoder).Decode(0xc000003540, {0x10ad6a0, 0xc00009c380})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/encoding/json/stream.go:63 +0x78 fp=0xc000155a48 sp=0xc000155a18 pc=0x581998
net/rpc/jsonrpc.(*serverCodec).ReadRequestHeader(0xc00009c360, 0xc000056540)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/rpc/jsonrpc/server.go:66 +0x85 fp=0xc000155a88 sp=0xc000155a48 pc=0x812ee5
github.com/couchbase/cbauth/revrpc.(*jsonServerCodec).ReadRequestHeader(0xc0000a2690?, 0xc000155b38?)
	:1 +0x2a fp=0xc000155aa8 sp=0xc000155a88 pc=0x81600a
net/rpc.(*Server).readRequestHeader(0xc0000a2690, {0x15cddf0, 0xc0000806c0})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/rpc/server.go:588 +0x66 fp=0xc000155b78 sp=0xc000155aa8 pc=0x8114c6
net/rpc.(*Server).readRequest(0x0?, {0x15cddf0, 0xc0000806c0})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/rpc/server.go:548 +0x3b fp=0xc000155c50 sp=0xc000155b78 pc=0x81101b
net/rpc.(*Server).ServeCodec(0xc0000a2690, {0x15cddf0?, 0xc0000806c0})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/rpc/server.go:463 +0x87 fp=0xc000155d48 sp=0xc000155c50 pc=0x810567
github.com/couchbase/cbauth/revrpc.(*Service).Run(0xc00006c760?, 0xc000159fa0)
	/opt/build/goproj/src/github.com/couchbase/cbauth/revrpc/revrpc.go:236 +0x77f fp=0xc000155f38 sp=0xc000155d48 pc=0x8149bf
github.com/couchbase/cbauth/revrpc.BabysitService(0x0?, 0x0?, {0x15c2040?, 0xc000132618?})
	/opt/build/goproj/src/github.com/couchbase/cbauth/revrpc/revrpc.go:335 +0x58 fp=0xc000155f70 sp=0xc000155f38 pc=0x815058
github.com/couchbase/cbauth.runRPCForSvc(0x0?, 0xc0000b6000)
	/opt/build/goproj/src/github.com/couchbase/cbauth/default.go:57 +0xbd fp=0xc000155fc0 sp=0xc000155f70 pc=0x81f5dd
github.com/couchbase/cbauth.startDefault.func1()
	/opt/build/goproj/src/github.com/couchbase/cbauth/default.go:66 +0x25 fp=0xc000155fe0 sp=0xc000155fc0 pc=0x81f8c5
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc000155fe8 sp=0xc000155fe0 pc=0x476581
created by github.com/couchbase/cbauth.startDefault
	/opt/build/goproj/src/github.com/couchbase/cbauth/default.go:65 +0xf9

goroutine 7 [GC worker (idle), 40 minutes]:
runtime.gopark(0x253e309943c4?, 0x3?, 0xa0?, 0x8f?, 0x16?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc00006cf50 sp=0xc00006cf30 pc=0x444c16
runtime.gcBgMarkWorker()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/mgc.go:1235 +0xf1 fp=0xc00006cfe0 sp=0xc00006cf50 pc=0x425e51
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc00006cfe8 sp=0xc00006cfe0 pc=0x476581
created by runtime.gcBgMarkStartWorkers
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/mgc.go:1159 +0x25

goroutine 34 [chan receive, 41 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc0004206f8 sp=0xc0004206d8 pc=0x444c16
runtime.chanrecv(0xc0002fc900, 0x0, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/chan.go:583 +0x49b fp=0xc000420788 sp=0xc0004206f8 pc=0x40ec3b
runtime.chanrecv1(0x0?, 0x0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/chan.go:442 +0x18 fp=0xc0004207b0 sp=0xc000420788 pc=0x40e738
github.com/couchbase/regulator/utils/logging.duplicateFlushLoop()
	/opt/build/goproj/src/github.com/couchbase/regulator/utils/logging/logger_golog.go:253 +0x25 fp=0xc0004207d0 sp=0xc0004207b0 pc=0xa40f05
github.com/couchbase/regulator/utils/logging.init.0.func1()
	/opt/build/goproj/src/github.com/couchbase/regulator/utils/logging/logger_golog.go:49 +0x25 fp=0xc0004207e0 sp=0xc0004207d0 pc=0xa436a5
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc0004207e8 sp=0xc0004207e0 pc=0x476581
created by github.com/couchbase/regulator/utils/logging.init.0
	/opt/build/goproj/src/github.com/couchbase/regulator/utils/logging/logger_golog.go:49 +0x7e

goroutine 35 [GC worker (idle), 40 minutes]:
runtime.gopark(0x253e3099866b?, 0x1?, 0x4a?, 0x6e?, 0x0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc000420f50 sp=0xc000420f30 pc=0x444c16
runtime.gcBgMarkWorker()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/mgc.go:1235 +0xf1 fp=0xc000420fe0 sp=0xc000420f50 pc=0x425e51
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc000420fe8 sp=0xc000420fe0 pc=0x476581
created by runtime.gcBgMarkStartWorkers
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/mgc.go:1159 +0x25

goroutine 50 [GC worker (idle), 40 minutes]:
runtime.gopark(0x2071b00?, 0x3?, 0x80?, 0xb3?, 0x0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc000067750 sp=0xc000067730 pc=0x444c16
runtime.gcBgMarkWorker()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/mgc.go:1235 +0xf1 fp=0xc0000677e0 sp=0xc000067750 pc=0x425e51
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc0000677e8 sp=0xc0000677e0 pc=0x476581
created by runtime.gcBgMarkStartWorkers
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/mgc.go:1159 +0x25

goroutine 36 [GC worker (idle), 40 minutes]:
runtime.gopark(0x253e2d00d982?, 0x3?, 0x84?, 0x53?, 0x0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc000421750 sp=0xc000421730 pc=0x444c16
runtime.gcBgMarkWorker()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/mgc.go:1235 +0xf1 fp=0xc0004217e0 sp=0xc000421750 pc=0x425e51
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc0004217e8 sp=0xc0004217e0 pc=0x476581
created by runtime.gcBgMarkStartWorkers
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/mgc.go:1159 +0x25

goroutine 22 [chan receive]:
runtime.gopark(0xc00f5416e0?, 0xc00009c658?, 0xe?, 0x17?, 0x2?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc00006c6b0 sp=0xc00006c690 pc=0x444c16
runtime.chanrecv(0xc00009c600, 0xc00006c798, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/chan.go:583 +0x49b fp=0xc00006c740 sp=0xc00006c6b0 pc=0x40ec3b
runtime.chanrecv2(0xbebc200?, 0x0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/chan.go:447 +0x18 fp=0xc00006c768 sp=0xc00006c740 pc=0x40e778
github.com/couchbase/plasma.(*smrManager).run(0xc0001b0330)
	/opt/build/goproj/src/github.com/couchbase/plasma/smr.go:526 +0x96 fp=0xc00006c7c8 sp=0xc00006c768 pc=0xf56c76
github.com/couchbase/plasma.NewSmrManager.func1()
	/opt/build/goproj/src/github.com/couchbase/plasma/smr.go:435 +0x26 fp=0xc00006c7e0 sp=0xc00006c7c8 pc=0xf56ba6
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc00006c7e8 sp=0xc00006c7e0 pc=0x476581
created by github.com/couchbase/plasma.NewSmrManager
	/opt/build/goproj/src/github.com/couchbase/plasma/smr.go:435 +0xaf

goroutine 23 [chan receive]:
runtime.gopark(0x1fe4b40?, 0x106cee0?, 0x40?, 0x4b?, 0x106dce0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc00084beb0 sp=0xc00084be90 pc=0x444c16
runtime.chanrecv(0xc000114600, 0xc00084bfb0, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/chan.go:583 +0x49b fp=0xc00084bf40 sp=0xc00084beb0 pc=0x40ec3b
runtime.chanrecv2(0xc00010c500?, 0x64?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/chan.go:447 +0x18 fp=0xc00084bf68 sp=0xc00084bf40 pc=0x40e778
github.com/couchbase/plasma.runCleanerAutoTuner()
	/opt/build/goproj/src/github.com/couchbase/plasma/auto_tuner.go:653 +0x194 fp=0xc00084bfe0 sp=0xc00084bf68 pc=0xf4ef34
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc00084bfe8 sp=0xc00084bfe0 pc=0x476581
created by github.com/couchbase/plasma.init.2
	/opt/build/goproj/src/github.com/couchbase/plasma/shard.go:239 +0x1fa

goroutine 24 [chan receive]:
runtime.gopark(0xc00009c480?, 0x1?, 0x20?, 0x87?, 0x0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc0000686c8 sp=0xc0000686a8 pc=0x444c16
runtime.chanrecv(0xc00009c420, 0xc0000687b0, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/chan.go:583 +0x49b fp=0xc000068758 sp=0xc0000686c8 pc=0x40ec3b
runtime.chanrecv2(0xdf8475800?, 0x0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/chan.go:447 +0x18 fp=0xc000068780 sp=0xc000068758 pc=0x40e778
github.com/couchbase/plasma.singletonWorker()
	/opt/build/goproj/src/github.com/couchbase/plasma/shard.go:5021 +0xb0 fp=0xc0000687e0 sp=0xc000068780 pc=0xf564b0
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc0000687e8 sp=0xc0000687e0 pc=0x476581
created by github.com/couchbase/plasma.init.2
	/opt/build/goproj/src/github.com/couchbase/plasma/shard.go:240 +0x206

goroutine 25 [chan receive]:
runtime.gopark(0xc009f9bc20?, 0xc000068ef0?, 0xbb?, 0x15?, 0xc000103380?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc000068ea8 sp=0xc000068e88 pc=0x444c16
runtime.chanrecv(0xc00009c4e0, 0xc000068fb0, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/chan.go:583 +0x49b fp=0xc000068f38 sp=0xc000068ea8 pc=0x40ec3b
runtime.chanrecv2(0x0?, 0x0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/chan.go:447 +0x18 fp=0xc000068f60 sp=0xc000068f38 pc=0x40e778
github.com/couchbase/plasma.systemResourceTracker()
	/opt/build/goproj/src/github.com/couchbase/plasma/mem.go:542 +0xa5 fp=0xc000068fe0 sp=0xc000068f60 pc=0xf60965
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc000068fe8 sp=0xc000068fe0 pc=0x476581
created by github.com/couchbase/plasma.init.2
	/opt/build/goproj/src/github.com/couchbase/plasma/shard.go:241 +0x212

goroutine 26 [chan receive, 1 minutes]:
runtime.gopark(0x459d52?, 0xc00007bbe0?, 0xe6?, 0x29?, 0x461c65?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc00007bbc0 sp=0xc00007bba0 pc=0x444c16
runtime.chanrecv(0xc00009c3c0, 0xc00007bf80, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/chan.go:583 +0x49b fp=0xc00007bc50 sp=0xc00007bbc0 pc=0x40ec3b
runtime.chanrecv2(0xc0001681a0?, 0x0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/chan.go:447 +0x18 fp=0xc00007bc78 sp=0xc00007bc50 pc=0x40e778
github.com/couchbase/plasma.AggregateAndLogStats()
	/opt/build/goproj/src/github.com/couchbase/plasma/shard.go:4982 +0x147 fp=0xc00007bfe0 sp=0xc00007bc78 pc=0xf546c7
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc00007bfe8 sp=0xc00007bfe0 pc=0x476581
created by github.com/couchbase/plasma.init.2
	/opt/build/goproj/src/github.com/couchbase/plasma/shard.go:242 +0x21e

goroutine 37 [chan receive]:
runtime.gopark(0xc0001146c0?, 0xc0001146b8?, 0xe?, 0x17?, 0x2?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc00041eeb0 sp=0xc00041ee90 pc=0x444c16
runtime.chanrecv(0xc000114660, 0xc00041ef98, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/chan.go:583 +0x49b fp=0xc00041ef40 sp=0xc00041eeb0 pc=0x40ec3b
runtime.chanrecv2(0x3b9aca00?, 0x0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/chan.go:447 +0x18 fp=0xc00041ef68 sp=0xc00041ef40 pc=0x40e778
github.com/couchbase/plasma.(*CleanerAutoTuner).refreshCleanerBandwidth(0xc00010c500)
	/opt/build/goproj/src/github.com/couchbase/plasma/auto_tuner.go:664 +0x86 fp=0xc00041efc8 sp=0xc00041ef68 pc=0xf4f0c6
github.com/couchbase/plasma.runCleanerAutoTuner.func2()
	/opt/build/goproj/src/github.com/couchbase/plasma/auto_tuner.go:650 +0x26 fp=0xc00041efe0 sp=0xc00041efc8 pc=0xf4efa6
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc00041efe8 sp=0xc00041efe0 pc=0x476581
created by github.com/couchbase/plasma.runCleanerAutoTuner
	/opt/build/goproj/src/github.com/couchbase/plasma/auto_tuner.go:650 +0x136

goroutine 8 [chan receive]:
runtime.gopark(0xc00d0b3c20?, 0xc0002fcbf8?, 0xe?, 0x17?, 0x2?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc0006d0e48 sp=0xc0006d0e28 pc=0x444c16
runtime.chanrecv(0xc0002fcba0, 0xc0006d0f38, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/chan.go:583 +0x49b fp=0xc0006d0ed8 sp=0xc0006d0e48 pc=0x40ec3b
runtime.chanrecv2(0x10be520?, 0xc0001b0240?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/chan.go:447 +0x18 fp=0xc0006d0f00 sp=0xc0006d0ed8 pc=0x40e778
github.com/couchbase/plasma.(*TenantMgr).Run(0xc0001681a0)
	/opt/build/goproj/src/github.com/couchbase/plasma/tenant.go:425 +0x9d fp=0xc0006d0fc8 sp=0xc0006d0f00 pc=0xf5aadd
github.com/couchbase/plasma.init.3.func1()
	/opt/build/goproj/src/github.com/couchbase/plasma/tenant.go:57 +0x26 fp=0xc0006d0fe0 sp=0xc0006d0fc8 pc=0xf59506
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc0006d0fe8 sp=0xc0006d0fe0 pc=0x476581
created by github.com/couchbase/plasma.init.3
	/opt/build/goproj/src/github.com/couchbase/plasma/tenant.go:57 +0x5d

goroutine 69 [select, 40 minutes]:
runtime.gopark(0xc00007ef90?, 0x2?, 0xd8?, 0xed?, 0xc00007ef24?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc00007ed90 sp=0xc00007ed70 pc=0x444c16
runtime.selectgo(0xc00007ef90, 0xc00007ef20, 0xc00028d240?, 0x0, 0xc0008dcf90?, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/select.go:328 +0x7bc fp=0xc00007eed0 sp=0xc00007ed90 pc=0x454a9c
net/http.(*persistConn).writeLoop(0xc0001a85a0)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/transport.go:2392 +0xf5 fp=0xc00007efc8 sp=0xc00007eed0 pc=0x790fd5
net/http.(*Transport).dialConn.func6()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/transport.go:1752 +0x26 fp=0xc00007efe0 sp=0xc00007efc8 pc=0x78dac6
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc00007efe8 sp=0xc00007efe0 pc=0x476581
created by net/http.(*Transport).dialConn
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/transport.go:1752 +0x1791

goroutine 30 [chan receive, 41 minutes]:
runtime.gopark(0x10ad940?, 0xc0000696a0?, 0x60?, 0x38?, 0xc0000696d8?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc000069678 sp=0xc000069658 pc=0x444c16
runtime.chanrecv(0xc000114c60, 0xc000069790, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/chan.go:583 +0x49b fp=0xc000069708 sp=0xc000069678 pc=0x40ec3b
runtime.chanrecv2(0xc000046470?, 0x16?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/chan.go:447 +0x18 fp=0xc000069730 sp=0xc000069708 pc=0x40e778
github.com/couchbase/goutils/systemeventlog.(*SystemEventLoggerImpl).logEvents(0xc000744000)
	/opt/build/goproj/src/github.com/couchbase/goutils/systemeventlog/system_event_logger.go:186 +0xb7 fp=0xc0000697c8 sp=0xc000069730 pc=0xc12f17
github.com/couchbase/goutils/systemeventlog.NewSystemEventLogger.func1()
	/opt/build/goproj/src/github.com/couchbase/goutils/systemeventlog/system_event_logger.go:125 +0x26 fp=0xc0000697e0 sp=0xc0000697c8 pc=0xc12826
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc0000697e8 sp=0xc0000697e0 pc=0x476581
created by github.com/couchbase/goutils/systemeventlog.NewSystemEventLogger
	/opt/build/goproj/src/github.com/couchbase/goutils/systemeventlog/system_event_logger.go:125 +0x1d6

goroutine 68 [select, 40 minutes]:
runtime.gopark(0xc00007af68?, 0x4?, 0x3?, 0x0?, 0xc00007adb0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc00007ac00 sp=0xc00007abe0 pc=0x444c16
runtime.selectgo(0xc00007af68, 0xc00007ada8, 0xc0001d22c0?, 0x0, 0x454101?, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/select.go:328 +0x7bc fp=0xc00007ad40 sp=0xc00007ac00 pc=0x454a9c
net/http.(*persistConn).readLoop(0xc0001a85a0)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/transport.go:2213 +0xd85 fp=0xc00007afc8 sp=0xc00007ad40 pc=0x78ff25
net/http.(*Transport).dialConn.func5()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/transport.go:1751 +0x26 fp=0xc00007afe0 sp=0xc00007afc8 pc=0x78db26
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc00007afe8 sp=0xc00007afe0 pc=0x476581
created by net/http.(*Transport).dialConn
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/transport.go:1751 +0x173e

goroutine 603 [select, 40 minutes]:
runtime.gopark(0xc000525f90?, 0x2?, 0xd8?, 0x5d?, 0xc000525f24?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc000525d90 sp=0xc000525d70 pc=0x444c16
runtime.selectgo(0xc000525f90, 0xc000525f20, 0xc009cd7a40?, 0x0, 0xc0066b1470?, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/select.go:328 +0x7bc fp=0xc000525ed0 sp=0xc000525d90 pc=0x454a9c
net/http.(*persistConn).writeLoop(0xc001b0ec60)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/transport.go:2392 +0xf5 fp=0xc000525fc8 sp=0xc000525ed0 pc=0x790fd5
net/http.(*Transport).dialConn.func6()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/transport.go:1752 +0x26 fp=0xc000525fe0 sp=0xc000525fc8 pc=0x78dac6
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc000525fe8 sp=0xc000525fe0 pc=0x476581
created by net/http.(*Transport).dialConn
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/transport.go:1752 +0x1791

goroutine 198 [sync.Cond.Wait, 40 minutes]:
runtime.gopark(0x0?, 0xc00ab07e10?, 0x1f?, 0x58?, 0xc00ab07de8?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc00007fd78 sp=0xc00007fd58 pc=0x444c16
runtime.goparkunlock(...)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:369
sync.runtime_notifyListWait(0xc005f1ab50, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/sema.go:517 +0x14c fp=0xc00007fdc0 sp=0xc00007fd78 pc=0x47266c
sync.(*Cond).Wait(0xc0090a52e0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/sync/cond.go:70 +0x8c fp=0xc00007fdf8 sp=0xc00007fdc0 pc=0x47dbcc
gopkg.in/couchbase/gocbcore%2ev7.(*memdOpQueue).pop(0xc009dfdc80, 0xc0090a52e0)
	/opt/build/go/pkg/mod/gopkg.in/couchbase/gocbcore.v7@v7.1.18/memdopqueue.go:138 +0x54 fp=0xc00007fe48 sp=0xc00007fdf8 pc=0xbeead4
gopkg.in/couchbase/gocbcore%2ev7.(*memdOpConsumer).Pop(...)
	/opt/build/go/pkg/mod/gopkg.in/couchbase/gocbcore.v7@v7.1.18/memdopqueue.go:28
gopkg.in/couchbase/gocbcore%2ev7.(*memdPipelineClient).ioLoop(0xc005f1ac40, 0xc00a914c80)
	/opt/build/go/pkg/mod/gopkg.in/couchbase/gocbcore.v7@v7.1.18/memdpipelineclient.go:129 +0x505 fp=0xc00007ff48 sp=0xc00007fe48 pc=0xbf0385
gopkg.in/couchbase/gocbcore%2ev7.(*memdPipelineClient).Run(0xc005f1ac40)
	/opt/build/go/pkg/mod/gopkg.in/couchbase/gocbcore.v7@v7.1.18/memdpipelineclient.go:195 +0x346 fp=0xc00007ffc8 sp=0xc00007ff48 pc=0xbf10a6
gopkg.in/couchbase/gocbcore%2ev7.(*memdPipeline).StartClients.func2()
	/opt/build/go/pkg/mod/gopkg.in/couchbase/gocbcore.v7@v7.1.18/memdpipeline.go:70 +0x26 fp=0xc00007ffe0 sp=0xc00007ffc8 pc=0xbef3c6
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc00007ffe8 sp=0xc00007ffe0 pc=0x476581
created by gopkg.in/couchbase/gocbcore%2ev7.(*memdPipeline).StartClients
	/opt/build/go/pkg/mod/gopkg.in/couchbase/gocbcore.v7@v7.1.18/memdpipeline.go:70 +0x91

goroutine 690 [select]:
runtime.gopark(0xc008fcbf98?, 0x2?, 0x3?, 0x30?, 0xc008fcbf94?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc008fcbe20 sp=0xc008fcbe00 pc=0x444c16
runtime.selectgo(0xc008fcbf98, 0xc008fcbf90, 0x2034940?, 0x0, 0x24143f6d194?, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/select.go:328 +0x7bc fp=0xc008fcbf60 sp=0xc008fcbe20 pc=0x454a9c
gopkg.in/couchbase/gocb%2ev1.(*ThresholdLoggingTracer).loggerRoutine(0xc008dd6000)
	/opt/build/go/pkg/mod/gopkg.in/couchbase/gocb.v1@v1.6.7/thresholdlogtracer.go:244 +0xa5 fp=0xc008fcbfc8 sp=0xc008fcbf60 pc=0xc02c85
gopkg.in/couchbase/gocb%2ev1.(*ThresholdLoggingTracer).startLoggerRoutine.func1()
	/opt/build/go/pkg/mod/gopkg.in/couchbase/gocb.v1@v1.6.7/thresholdlogtracer.go:239 +0x26 fp=0xc008fcbfe0 sp=0xc008fcbfc8 pc=0xc02ba6
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc008fcbfe8 sp=0xc008fcbfe0 pc=0x476581
created by gopkg.in/couchbase/gocb%2ev1.(*ThresholdLoggingTracer).startLoggerRoutine
	/opt/build/go/pkg/mod/gopkg.in/couchbase/gocb.v1@v1.6.7/thresholdlogtracer.go:239 +0x4aa

goroutine 107 [IO wait]:
runtime.gopark(0x2710?, 0xb?, 0x0?, 0x0?, 0x7?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc00086bcb0 sp=0xc00086bc90 pc=0x444c16
runtime.netpollblock(0x4897c5?, 0x454193?, 0x0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/netpoll.go:526 +0xf7 fp=0xc00086bce8 sp=0xc00086bcb0 pc=0x43d3f7
internal/poll.runtime_pollWait(0x7f05f8758198, 0x72)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/netpoll.go:305 +0x89 fp=0xc00086bd08 sp=0xc00086bce8 pc=0x470a09
internal/poll.(*pollDesc).wait(0xc000293780?, 0xc009ab73a0?, 0x0)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/internal/poll/fd_poll_runtime.go:84 +0x32 fp=0xc00086bd30 sp=0xc00086bd08 pc=0x4ab6f2
internal/poll.(*pollDesc).waitRead(...)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0xc000293780, {0xc009ab73a0, 0x8, 0x8})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/internal/poll/fd_unix.go:167 +0x25a fp=0xc00086bdb0 sp=0xc00086bd30 pc=0x4aca5a
net.(*netFD).Read(0xc000293780, {0xc009ab73a0?, 0xc00086be30?, 0x459d52?})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/fd_posix.go:55 +0x29 fp=0xc00086bdf8 sp=0xc00086bdb0 pc=0x68ae29
net.(*conn).Read(0xc0002867b0, {0xc009ab73a0?, 0xc00086bed8?, 0xc119ef?})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/net.go:183 +0x45 fp=0xc00086be40 sp=0xc00086bdf8 pc=0x69b0a5
github.com/couchbase/gometa/common.(*PeerPipe).readBytes(0xc0008acf80, 0x15ced60?, {0x0?, 0xc00afc7cd0?, 0x1?})
	/opt/build/goproj/src/github.com/couchbase/gometa/common/peerPipe.go:313 +0xcf fp=0xc00086bee8 sp=0xc00086be40 pc=0xc11b2f
github.com/couchbase/gometa/common.(*PeerPipe).doReceive(0xc0008acf80)
	/opt/build/goproj/src/github.com/couchbase/gometa/common/peerPipe.go:214 +0xd5 fp=0xc00086bfc8 sp=0xc00086bee8 pc=0xc10c95
github.com/couchbase/gometa/common.NewPeerPipe.func2()
	/opt/build/goproj/src/github.com/couchbase/gometa/common/peerPipe.go:66 +0x26 fp=0xc00086bfe0 sp=0xc00086bfc8 pc=0xc0fd06
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc00086bfe8 sp=0xc00086bfe0 pc=0x476581
created by github.com/couchbase/gometa/common.NewPeerPipe
	/opt/build/goproj/src/github.com/couchbase/gometa/common/peerPipe.go:66 +0x18a

goroutine 73 [select]:
runtime.gopark(0xc02177bf98?, 0x2?, 0xb1?, 0x7a?, 0xc02177bf8c?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc02177be18 sp=0xc02177bdf8 pc=0x444c16
runtime.selectgo(0xc02177bf98, 0xc02177bf88, 0x0?, 0x0, 0x0?, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/select.go:328 +0x7bc fp=0xc02177bf58 sp=0xc02177be18 pc=0x454a9c
github.com/couchbase/indexing/secondary/queryport/client.(*schedTokenMonitor).updater(0xc0007925d0)
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/queryport/client/meta_client.go:2396 +0x92 fp=0xc02177bfc8 sp=0xc02177bf58 pc=0xd63572
github.com/couchbase/indexing/secondary/queryport/client.newSchedTokenMonitor.func1()
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/queryport/client/meta_client.go:2189 +0x26 fp=0xc02177bfe0 sp=0xc02177bfc8 pc=0xd61ee6
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc02177bfe8 sp=0xc02177bfe0 pc=0x476581
created by github.com/couchbase/indexing/secondary/queryport/client.newSchedTokenMonitor
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/queryport/client/meta_client.go:2189 +0x296

goroutine 78 [select, 4 minutes]:
runtime.gopark(0xc000585d08?, 0x4?, 0x2?, 0x0?, 0xc000585c90?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc000585b08 sp=0xc000585ae8 pc=0x444c16
runtime.selectgo(0xc000585d08, 0xc000585c88, 0xc009c7db30?, 0x0, 0xc000792c01?, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/select.go:328 +0x7bc fp=0xc000585c48 sp=0xc000585b08 pc=0x454a9c
github.com/couchbase/gometa/protocol.runWatcher(0xc0001d3280, {0x15c5aa0, 0xc0005600c0}, {0x15d76f8?, 0xc0005600c0}, {0x15d5560?, 0x20706d8}, 0xc00054c150, 0xc000119740, 0xc00054c1c0, ...)
	/opt/build/goproj/src/github.com/couchbase/gometa/protocol/watcherServer.go:357 +0x3be fp=0xc000585d60 sp=0xc000585c48 pc=0xcc769e
github.com/couchbase/gometa/protocol.runOnce(0x1, {0xc00052a8d0, 0xe}, {0x15c5aa0, 0xc0005600c0}, {0x15d76f8, 0xc0005600c0}, {0x15d5560, 0x20706d8}, 0xc00054c150, ...)
	/opt/build/goproj/src/github.com/couchbase/gometa/protocol/watcherServer.go:213 +0x525 fp=0xc000585e90 sp=0xc000585d60 pc=0xcc6745
github.com/couchbase/gometa/protocol.RunWatcherServerWithRequest2({0xc00052a8d0, 0xe}, {0x15c5aa0, 0xc0005600c0}, {0x15d76f8, 0xc0005600c0}, {0x15d5560, 0x20706d8}, 0xc00054c150, 0xc000119740, ...)
	/opt/build/goproj/src/github.com/couchbase/gometa/protocol/watcherServer.go:65 +0x174 fp=0xc000585f68 sp=0xc000585e90 pc=0xcc6074
github.com/couchbase/indexing/secondary/manager/client.(*MetadataProvider).startWatcher.func1()
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/manager/client/metadata_provider.go:4315 +0x9f fp=0xc000585fe0 sp=0xc000585f68 pc=0xd23fff
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc000585fe8 sp=0xc000585fe0 pc=0x476581
created by github.com/couchbase/indexing/secondary/manager/client.(*MetadataProvider).startWatcher
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/manager/client/metadata_provider.go:4315 +0x1f8

goroutine 38 [select]:
runtime.gopark(0xc000883ba0?, 0x3?, 0x0?, 0x30?, 0xc000883b1a?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc000883978 sp=0xc000883958 pc=0x444c16
runtime.selectgo(0xc000883ba0, 0xc000883b14, 0x0?, 0x0, 0x0?, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/select.go:328 +0x7bc fp=0xc000883ab8 sp=0xc000883978 pc=0x454a9c
github.com/couchbase/cbauth/metakv.doRunObserveChildren(0xc0002842b0?, {0x1291522, 0x1b}, 0xc000883e68, 0xc000119260)
	/opt/build/goproj/src/github.com/couchbase/cbauth/metakv/metakv.go:301 +0x429 fp=0xc000883e40 sp=0xc000883ab8 pc=0xa348c9
github.com/couchbase/cbauth/metakv.(*store).runObserveChildren(...)
	/opt/build/goproj/src/github.com/couchbase/cbauth/metakv/metakv.go:259
github.com/couchbase/cbauth/metakv.RunObserveChildren({0x1291522?, 0x0?}, 0x0?, 0x0?)
	/opt/build/goproj/src/github.com/couchbase/cbauth/metakv/metakv.go:389 +0x58 fp=0xc000883e88 sp=0xc000883e40 pc=0xa34e58
github.com/couchbase/indexing/secondary/manager/common.(*CommandListener).ListenTokens.func2.1(0x0?, {0x0?, 0x0?})
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/manager/common/token.go:1607 +0xc7 fp=0xc000883f00 sp=0xc000883e88 pc=0xc17ee7
github.com/couchbase/indexing/secondary/common.(*RetryHelper).Run(0xc000883fa0)
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/common/retry_helper.go:36 +0x83 fp=0xc000883f38 sp=0xc000883f00 pc=0xb58b23
github.com/couchbase/indexing/secondary/manager/common.(*CommandListener).ListenTokens.func2()
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/manager/common/token.go:1612 +0xdf fp=0xc000883fe0 sp=0xc000883f38 pc=0xc17d9f
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc000883fe8 sp=0xc000883fe0 pc=0x476581
created by github.com/couchbase/indexing/secondary/manager/common.(*CommandListener).ListenTokens
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/manager/common/token.go:1600 +0xaf

goroutine 120 [sleep]:
runtime.gopark(0x27768b5c20c8?, 0xc0006d0718?, 0x85?, 0xf1?, 0x10?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc0006d06e8 sp=0xc0006d06c8 pc=0x444c16
time.Sleep(0x3b9aca00)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/time.go:195 +0x135 fp=0xc0006d0728 sp=0xc0006d06e8 pc=0x4733f5
github.com/couchbase/indexing/secondary/queryport/client.(*connectionPool).releaseConnsRoutine(0xc00014cd10)
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/queryport/client/conn_pool.go:461 +0xa5 fp=0xc0006d07c8 sp=0xc0006d0728 pc=0xd540a5
github.com/couchbase/indexing/secondary/queryport/client.newConnectionPool.func1()
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/queryport/client/conn_pool.go:101 +0x26 fp=0xc0006d07e0 sp=0xc0006d07c8 pc=0xd51306
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc0006d07e8 sp=0xc0006d07e0 pc=0x476581
created by github.com/couchbase/indexing/secondary/queryport/client.newConnectionPool
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/queryport/client/conn_pool.go:101 +0x4b6

goroutine 44 [chan receive]:
runtime.gopark(0xc00b2ca180?, 0xc00a715f90?, 0x20?, 0x5f?, 0xc005237ec0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc00a715ec8 sp=0xc00a715ea8 pc=0x444c16
runtime.chanrecv(0xc00009c7e0, 0x0, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/chan.go:583 +0x49b fp=0xc00a715f58 sp=0xc00a715ec8 pc=0x40ec3b
runtime.chanrecv1(0xc00014cf20?, 0xc000055ec0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/chan.go:442 +0x18 fp=0xc00a715f80 sp=0xc00a715f58 pc=0x40e738
github.com/couchbase/indexing/secondary/queryport/client.(*metadataClient).logstats(0xc00014cf20)
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/queryport/client/meta_client.go:1412 +0x79 fp=0xc00a715fc8 sp=0xc00a715f80 pc=0xd5c019
github.com/couchbase/indexing/secondary/queryport/client.newMetaBridgeClient.func2()
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/queryport/client/meta_client.go:152 +0x26 fp=0xc00a715fe0 sp=0xc00a715fc8 pc=0xd54986
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc00a715fe8 sp=0xc00a715fe0 pc=0x476581
created by github.com/couchbase/indexing/secondary/queryport/client.newMetaBridgeClient
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/queryport/client/meta_client.go:152 +0x616

goroutine 74 [select, 41 minutes]:
runtime.gopark(0xc0001e8f68?, 0x4?, 0x3?, 0x0?, 0xc0001e8db0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc0001e8c00 sp=0xc0001e8be0 pc=0x444c16
runtime.selectgo(0xc0001e8f68, 0xc0001e8da8, 0xc00009f900?, 0x0, 0xc0006cdd01?, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/select.go:328 +0x7bc fp=0xc0001e8d40 sp=0xc0001e8c00 pc=0x454a9c
net/http.(*persistConn).readLoop(0xc000542000)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/transport.go:2213 +0xd85 fp=0xc0001e8fc8 sp=0xc0001e8d40 pc=0x78ff25
net/http.(*Transport).dialConn.func5()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/transport.go:1751 +0x26 fp=0xc0001e8fe0 sp=0xc0001e8fc8 pc=0x78db26
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc0001e8fe8 sp=0xc0001e8fe0 pc=0x476581
created by net/http.(*Transport).dialConn
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/transport.go:1751 +0x173e

goroutine 75 [select, 41 minutes]:
runtime.gopark(0xc0001e9f90?, 0x2?, 0xd8?, 0x9d?, 0xc0001e9f24?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc0001e9d90 sp=0xc0001e9d70 pc=0x444c16
runtime.selectgo(0xc0001e9f90, 0xc0001e9f20, 0xc00028db80?, 0x0, 0xc000792960?, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/select.go:328 +0x7bc fp=0xc0001e9ed0 sp=0xc0001e9d90 pc=0x454a9c
net/http.(*persistConn).writeLoop(0xc000542000)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/transport.go:2392 +0xf5 fp=0xc0001e9fc8 sp=0xc0001e9ed0 pc=0x790fd5
net/http.(*Transport).dialConn.func6()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/transport.go:1752 +0x26 fp=0xc0001e9fe0 sp=0xc0001e9fc8 pc=0x78dac6
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc0001e9fe8 sp=0xc0001e9fe0 pc=0x476581
created by net/http.(*Transport).dialConn
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/transport.go:1752 +0x1791

goroutine 171 [chan send, 40 minutes]:
runtime.gopark(0xc009f5d470?, 0xc0067acb40?, 0x2?, 0x0?, 0xc0008ffa90?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc0008ffa50 sp=0xc0008ffa30 pc=0x444c16
runtime.chansend(0xc00a22ecc0, 0xc0008ffb38, 0x1, 0x73613a726f746172?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/chan.go:259 +0x42c fp=0xc0008ffad8 sp=0xc0008ffa50 pc=0x40de2c
runtime.chansend1(0x7473696e696d6441?, 0x73613a726f746172?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/chan.go:145 +0x1d fp=0xc0008ffb08 sp=0xc0008ffad8 pc=0x40d9dd
gopkg.in/couchbase/gocbcore%2ev7.(*Agent).connect.func2(0xc00a231f80?, {0x0?, 0x12a8a75?}, {0x15c11c0?, 0xc0002a2640?})
	/opt/build/go/pkg/mod/gopkg.in/couchbase/gocbcore.v7@v7.1.18/agent.go:879 +0x128 fp=0xc0008ffb58 sp=0xc0008ffb08 pc=0xbcf128
gopkg.in/couchbase/gocbcore%2ev7.(*Agent).httpLooper.func1(0x1)
	/opt/build/go/pkg/mod/gopkg.in/couchbase/gocbcore.v7@v7.1.18/agenthttpcfg.go:136 +0x41e fp=0xc0008ffc50 sp=0xc0008ffb58 pc=0xbd36be
gopkg.in/couchbase/gocbcore%2ev7.(*Agent).httpLooper.func1(0x0)
	/opt/build/go/pkg/mod/gopkg.in/couchbase/gocbcore.v7@v7.1.18/agenthttpcfg.go:140 +0x44a fp=0xc0008ffd48 sp=0xc0008ffc50 pc=0xbd36ea
gopkg.in/couchbase/gocbcore%2ev7.(*Agent).httpLooper(0xc00087d680, 0xc005425480)
	/opt/build/go/pkg/mod/gopkg.in/couchbase/gocbcore.v7@v7.1.18/agenthttpcfg.go:148 +0x354 fp=0xc0008fffc0 sp=0xc0008ffd48 pc=0xbd27b4
gopkg.in/couchbase/gocbcore%2ev7.(*Agent).connect.func4()
	/opt/build/go/pkg/mod/gopkg.in/couchbase/gocbcore.v7@v7.1.18/agent.go:877 +0x2a fp=0xc0008fffe0 sp=0xc0008fffc0 pc=0xbcefca
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc0008fffe8 sp=0xc0008fffe0 pc=0x476581
created by gopkg.in/couchbase/gocbcore%2ev7.(*Agent).connect
	/opt/build/go/pkg/mod/gopkg.in/couchbase/gocbcore.v7@v7.1.18/agent.go:877 +0x31b

goroutine 65 [IO wait]:
runtime.gopark(0xd52?, 0xb?, 0x0?, 0x0?, 0xb?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc00007d9a0 sp=0xc00007d980 pc=0x444c16
runtime.netpollblock(0x4897c5?, 0x4ab3e5?, 0x0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/netpoll.go:526 +0xf7 fp=0xc00007d9d8 sp=0xc00007d9a0 pc=0x43d3f7
internal/poll.runtime_pollWait(0x7f05f8758738, 0x72)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/netpoll.go:305 +0x89 fp=0xc00007d9f8 sp=0xc00007d9d8 pc=0x470a09
internal/poll.(*pollDesc).wait(0xc000544000?, 0xc0007a2000?, 0x0)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/internal/poll/fd_poll_runtime.go:84 +0x32 fp=0xc00007da20 sp=0xc00007d9f8 pc=0x4ab6f2
internal/poll.(*pollDesc).waitRead(...)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0xc000544000, {0xc0007a2000, 0x1000, 0x1000})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/internal/poll/fd_unix.go:167 +0x25a fp=0xc00007daa0 sp=0xc00007da20 pc=0x4aca5a
net.(*netFD).Read(0xc000544000, {0xc0007a2000?, 0x1c0002a36d0?, 0x57ed3f?})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/fd_posix.go:55 +0x29 fp=0xc00007dae8 sp=0xc00007daa0 pc=0x68ae29
net.(*conn).Read(0xc000128808, {0xc0007a2000?, 0x56f446?, 0xc000296de8?})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/net.go:183 +0x45 fp=0xc00007db30 sp=0xc00007dae8 pc=0x69b0a5
net/http.(*persistConn).Read(0xc000542000, {0xc0007a2000?, 0x105d3e0?, 0xc00007de08?})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/transport.go:1929 +0x4e fp=0xc00007db90 sp=0xc00007db30 pc=0x78e52e
bufio.(*Reader).fill(0xc00009d4a0)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/bufio/bufio.go:106 +0xff fp=0xc00007dbc8 sp=0xc00007db90 pc=0x52c3bf
bufio.(*Reader).ReadSlice(0xc00009d4a0, 0x81?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/bufio/bufio.go:372 +0x2f fp=0xc00007dc18 sp=0xc00007dbc8 pc=0x52cfaf
net/http/internal.readChunkLine(0xda4?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/internal/chunked.go:129 +0x25 fp=0xc00007dc68 sp=0xc00007dc18 pc=0x72bc85
net/http/internal.(*chunkedReader).beginChunk(0xc0005be930)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/internal/chunked.go:48 +0x28 fp=0xc00007dc98 sp=0xc00007dc68 pc=0x72b6e8
net/http/internal.(*chunkedReader).Read(0xc0005be930, {0xc008ec5000?, 0xc0001aaf30?, 0xc00a6ae990?})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/internal/chunked.go:98 +0x156 fp=0xc00007dd18 sp=0xc00007dc98 pc=0x72b9b6
net/http.(*body).readLocked(0xc00009f900, {0xc008ec5000?, 0xc000296e50?, 0x1168cc0?})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/transfer.go:846 +0x3c fp=0xc00007dd68 sp=0xc00007dd18 pc=0x78305c
net/http.(*body).Read(0x0?, {0xc008ec5000?, 0x56f03e?, 0x0?})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/transfer.go:838 +0x125 fp=0xc00007dde0 sp=0xc00007dd68 pc=0x782f25
net/http.(*bodyEOFSignal).Read(0xc00009f940, {0xc008ec5000, 0xe00, 0xe00})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/transport.go:2774 +0x142 fp=0xc00007de60 sp=0xc00007dde0 pc=0x792c02
encoding/json.(*Decoder).refill(0xc000296dc0)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/encoding/json/stream.go:165 +0x188 fp=0xc00007deb0 sp=0xc00007de60 pc=0x582168
encoding/json.(*Decoder).readValue(0xc000296dc0)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/encoding/json/stream.go:140 +0xbb fp=0xc00007df00 sp=0xc00007deb0 pc=0x581d5b
encoding/json.(*Decoder).Decode(0xc000296dc0, {0x10366e0, 0xc0002a36d0})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/encoding/json/stream.go:63 +0x78 fp=0xc00007df30 sp=0xc00007df00 pc=0x581998
github.com/couchbase/cbauth/metakv.doRunObserveChildren.func1()
	/opt/build/goproj/src/github.com/couchbase/cbauth/metakv/metakv.go:284 +0x10b fp=0xc00007dfe0 sp=0xc00007df30 pc=0xa34d4b
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc00007dfe8 sp=0xc00007dfe0 pc=0x476581
created by github.com/couchbase/cbauth/metakv.doRunObserveChildren
	/opt/build/goproj/src/github.com/couchbase/cbauth/metakv/metakv.go:280 +0x2eb

goroutine 576 [select, 40 minutes]:
runtime.gopark(0xc00afdfd88?, 0x2?, 0x60?, 0xda?, 0xc00afdfc84?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc00afdfaf0 sp=0xc00afdfad0 pc=0x444c16
runtime.selectgo(0xc00afdfd88, 0xc00afdfc80, 0x0?, 0x0, 0x0?, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/select.go:328 +0x7bc fp=0xc00afdfc30 sp=0xc00afdfaf0 pc=0x454a9c
github.com/couchbase/indexing/secondary/dcp.(*Client).runObserveStreamingEndpoint(0x0?, {0xc009d42db0, 0x17}, 0xc009d39b78, 0xc009d39b60, 0xc00aa70660)
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/dcp/pools.go:503 +0x6da fp=0xc00afdfed8 sp=0xc00afdfc30 pc=0x84d29a
github.com/couchbase/indexing/secondary/dcp.(*Client).RunObservePool(0xc009d7a000?, {0x1276c0b?, 0xc0006cbf88?}, 0x12b34e9?, 0x0?)
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/dcp/pools.go:395 +0xb2 fp=0xc00afdff40 sp=0xc00afdfed8 pc=0x84c3d2
github.com/couchbase/indexing/secondary/common.(*serviceNotifierInstance).RunPoolObserver(0xc009d7a000)
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/common/services_notifier.go:147 +0x5c fp=0xc00afdffc8 sp=0xc00afdff40 pc=0xb5a93c
github.com/couchbase/indexing/secondary/common.NewServicesChangeNotifier.func2()
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/common/services_notifier.go:256 +0x26 fp=0xc00afdffe0 sp=0xc00afdffc8 pc=0xb5bc26
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc00afdffe8 sp=0xc00afdffe0 pc=0x476581
created by github.com/couchbase/indexing/secondary/common.NewServicesChangeNotifier
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/common/services_notifier.go:256 +0x50a

goroutine 170 [select]:
runtime.gopark(0xc006a7df98?, 0x2?, 0x3?, 0x30?, 0xc006a7df94?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc006a7de20 sp=0xc006a7de00 pc=0x444c16
runtime.selectgo(0xc006a7df98, 0xc006a7df90, 0x2034940?, 0x0, 0x241e20c156b?, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/select.go:328 +0x7bc fp=0xc006a7df60 sp=0xc006a7de20 pc=0x454a9c
gopkg.in/couchbase/gocb%2ev1.(*ThresholdLoggingTracer).loggerRoutine(0xc009e881e0)
	/opt/build/go/pkg/mod/gopkg.in/couchbase/gocb.v1@v1.6.7/thresholdlogtracer.go:244 +0xa5 fp=0xc006a7dfc8 sp=0xc006a7df60 pc=0xc02c85
gopkg.in/couchbase/gocb%2ev1.(*ThresholdLoggingTracer).startLoggerRoutine.func1()
	/opt/build/go/pkg/mod/gopkg.in/couchbase/gocb.v1@v1.6.7/thresholdlogtracer.go:239 +0x26 fp=0xc006a7dfe0 sp=0xc006a7dfc8 pc=0xc02ba6
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc006a7dfe8 sp=0xc006a7dfe0 pc=0x476581
created by gopkg.in/couchbase/gocb%2ev1.(*ThresholdLoggingTracer).startLoggerRoutine
	/opt/build/go/pkg/mod/gopkg.in/couchbase/gocb.v1@v1.6.7/thresholdlogtracer.go:239 +0x4aa

goroutine 96 [select]:
runtime.gopark(0xc010145f90?, 0x2?, 0xd0?, 0xb9?, 0xc010145f4c?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc010145dc8 sp=0xc010145da8 pc=0x444c16
runtime.selectgo(0xc010145f90, 0xc010145f48, 0xc00cdd1a10?, 0x0, 0x101000101000001?, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/select.go:328 +0x7bc fp=0xc010145f08 sp=0xc010145dc8 pc=0x454a9c
github.com/couchbase/gometa/protocol.(*Follower).startListener(0xc0006fc380)
	/opt/build/goproj/src/github.com/couchbase/gometa/protocol/follower.go:149 +0xe5 fp=0xc010145fc8 sp=0xc010145f08 pc=0xcc4bc5
github.com/couchbase/gometa/protocol.(*Follower).Start.func1()
	/opt/build/goproj/src/github.com/couchbase/gometa/protocol/follower.go:76 +0x26 fp=0xc010145fe0 sp=0xc010145fc8 pc=0xcc47c6
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc010145fe8 sp=0xc010145fe0 pc=0x476581
created by github.com/couchbase/gometa/protocol.(*Follower).Start
	/opt/build/goproj/src/github.com/couchbase/gometa/protocol/follower.go:76 +0x56

goroutine 97 [select]:
runtime.gopark(0xc00041ff90?, 0x2?, 0xe3?, 0x4a?, 0xc00041ff74?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc00041fdf8 sp=0xc00041fdd8 pc=0x444c16
runtime.selectgo(0xc00041ff90, 0xc00041ff70, 0x0?, 0x0, 0x0?, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/select.go:328 +0x7bc fp=0xc00041ff38 sp=0xc00041fdf8 pc=0x454a9c
github.com/couchbase/indexing/secondary/manager/client.(*watcher).timeoutChecker(0xc0005600c0)
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/manager/client/metadata_provider.go:6530 +0xcd fp=0xc00041ffc8 sp=0xc00041ff38 pc=0xd3238d
github.com/couchbase/indexing/secondary/manager/client.(*watcher).startTimer.func1()
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/manager/client/metadata_provider.go:6521 +0x26 fp=0xc00041ffe0 sp=0xc00041ffc8 pc=0xd32286
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc00041ffe8 sp=0xc00041ffe0 pc=0x476581
created by github.com/couchbase/indexing/secondary/manager/client.(*watcher).startTimer
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/manager/client/metadata_provider.go:6521 +0x8d

goroutine 1067 [select, 39 minutes]:
runtime.gopark(0xc0085747b0?, 0x2?, 0xb0?, 0x47?, 0xc008574774?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc0085745f8 sp=0xc0085745d8 pc=0x444c16
runtime.selectgo(0xc0085747b0, 0xc008574770, 0xc0085747b0?, 0x0, 0x12cf6c4?, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/select.go:328 +0x7bc fp=0xc008574738 sp=0xc0085745f8 pc=0x454a9c
gopkg.in/couchbase/gocbcore%2ev7.(*memdClient).run.func1()
	/opt/build/go/pkg/mod/gopkg.in/couchbase/gocbcore.v7@v7.1.18/memdclient.go:257 +0xa6 fp=0xc0085747e0 sp=0xc008574738 pc=0xbeb566
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc0085747e8 sp=0xc0085747e0 pc=0x476581
created by gopkg.in/couchbase/gocbcore%2ev7.(*memdClient).run
	/opt/build/go/pkg/mod/gopkg.in/couchbase/gocbcore.v7@v7.1.18/memdclient.go:255 +0xea

goroutine 94 [chan receive, 4 minutes]:
runtime.gopark(0x4aec20?, 0xc00012b680?, 0x28?, 0x5e?, 0xc000865e88?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc000865e28 sp=0xc000865e08 pc=0x444c16
runtime.chanrecv(0xc00009d680, 0xc000865f90, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/chan.go:583 +0x49b fp=0xc000865eb8 sp=0xc000865e28 pc=0x40ec3b
runtime.chanrecv2(0xc000128828?, 0xc00c37dac0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/chan.go:447 +0x18 fp=0xc000865ee0 sp=0xc000865eb8 pc=0x40e778
github.com/couchbase/gometa/common.(*PeerPipe).doSend(0xc0001d3280)
	/opt/build/goproj/src/github.com/couchbase/gometa/common/peerPipe.go:167 +0x9e fp=0xc000865fc8 sp=0xc000865ee0 pc=0xc104be
github.com/couchbase/gometa/common.NewPeerPipe.func1()
	/opt/build/goproj/src/github.com/couchbase/gometa/common/peerPipe.go:65 +0x26 fp=0xc000865fe0 sp=0xc000865fc8 pc=0xc0fd66
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc000865fe8 sp=0xc000865fe0 pc=0x476581
created by github.com/couchbase/gometa/common.NewPeerPipe
	/opt/build/goproj/src/github.com/couchbase/gometa/common/peerPipe.go:65 +0x14d

goroutine 213 [IO wait]:
runtime.gopark(0xd52?, 0xb?, 0x0?, 0x0?, 0x1b?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc0005279a0 sp=0xc000527980 pc=0x444c16
runtime.netpollblock(0x4897c5?, 0x4ab3e5?, 0x0?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/netpoll.go:526 +0xf7 fp=0xc0005279d8 sp=0xc0005279a0 pc=0x43d3f7
internal/poll.runtime_pollWait(0x7f05f8758288, 0x72)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/netpoll.go:305 +0x89 fp=0xc0005279f8 sp=0xc0005279d8 pc=0x470a09
internal/poll.(*pollDesc).wait(0xc00a6ad780?, 0xc00aa6b000?, 0x0)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/internal/poll/fd_poll_runtime.go:84 +0x32 fp=0xc000527a20 sp=0xc0005279f8 pc=0x4ab6f2
internal/poll.(*pollDesc).waitRead(...)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0xc00a6ad780, {0xc00aa6b000, 0x1000, 0x1000})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/internal/poll/fd_unix.go:167 +0x25a fp=0xc000527aa0 sp=0xc000527a20 pc=0x4aca5a
net.(*netFD).Read(0xc00a6ad780, {0xc00aa6b000?, 0x1c00a5f0f00?, 0x57ed3f?})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/fd_posix.go:55 +0x29 fp=0xc000527ae8 sp=0xc000527aa0 pc=0x68ae29
net.(*conn).Read(0xc000014740, {0xc00aa6b000?, 0x56f446?, 0xc00a7768e8?})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/net.go:183 +0x45 fp=0xc000527b30 sp=0xc000527ae8 pc=0x69b0a5
net/http.(*persistConn).Read(0xc00aa6c5a0, {0xc00aa6b000?, 0x105d3e0?, 0xc000527e08?})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/transport.go:1929 +0x4e fp=0xc000527b90 sp=0xc000527b30 pc=0x78e52e
bufio.(*Reader).fill(0xc00a854720)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/bufio/bufio.go:106 +0xff fp=0xc000527bc8 sp=0xc000527b90 pc=0x52c3bf
bufio.(*Reader).ReadSlice(0xc00a854720, 0x81?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/bufio/bufio.go:372 +0x2f fp=0xc000527c18 sp=0xc000527bc8 pc=0x52cfaf
net/http/internal.readChunkLine(0xda4?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/internal/chunked.go:129 +0x25 fp=0xc000527c68 sp=0xc000527c18 pc=0x72bc85
net/http/internal.(*chunkedReader).beginChunk(0xc00a889950)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/internal/chunked.go:48 +0x28 fp=0xc000527c98 sp=0xc000527c68 pc=0x72b6e8
net/http/internal.(*chunkedReader).Read(0xc00a889950, {0xc008ed8000?, 0xc0001aaf30?, 0xc008eb16e0?})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/internal/chunked.go:98 +0x156 fp=0xc000527d18 sp=0xc000527c98 pc=0x72b9b6
net/http.(*body).readLocked(0xc00a884600, {0xc008ed8000?, 0xc00a776950?, 0x1168cc0?})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/transfer.go:846 +0x3c fp=0xc000527d68 sp=0xc000527d18 pc=0x78305c
net/http.(*body).Read(0x0?, {0xc008ed8000?, 0x56f03e?, 0x0?})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/transfer.go:838 +0x125 fp=0xc000527de0 sp=0xc000527d68 pc=0x782f25
net/http.(*bodyEOFSignal).Read(0xc00a884640, {0xc008ed8000, 0xe00, 0xe00})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/net/http/transport.go:2774 +0x142 fp=0xc000527e60 sp=0xc000527de0 pc=0x792c02
encoding/json.(*Decoder).refill(0xc00a7768c0)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/encoding/json/stream.go:165 +0x188 fp=0xc000527eb0 sp=0xc000527e60 pc=0x582168
encoding/json.(*Decoder).readValue(0xc00a7768c0)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/encoding/json/stream.go:140 +0xbb fp=0xc000527f00 sp=0xc000527eb0 pc=0x581d5b
encoding/json.(*Decoder).Decode(0xc00a7768c0, {0x10366e0, 0xc00a5f0f00})
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/encoding/json/stream.go:63 +0x78 fp=0xc000527f30 sp=0xc000527f00 pc=0x581998
github.com/couchbase/cbauth/metakv.doRunObserveChildren.func1()
	/opt/build/goproj/src/github.com/couchbase/cbauth/metakv/metakv.go:284 +0x10b fp=0xc000527fe0 sp=0xc000527f30 pc=0xa34d4b
runtime.goexit()
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc000527fe8 sp=0xc000527fe0 pc=0x476581
created by github.com/couchbase/cbauth/metakv.doRunObserveChildren
	/opt/build/goproj/src/github.com/couchbase/cbauth/metakv/metakv.go:280 +0x2eb

goroutine 135 [select]:
runtime.gopark(0xc008202f98?, 0x2?, 0xb1?, 0x7a?, 0xc008202f8c?)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/proc.go:363 +0xd6 fp=0xc008202e18 sp=0xc008202df8 pc=0x444c16
runtime.selectgo(0xc008202f98, 0xc008202f88, 0x1273200?, 0x0, 0xc005f1ad40?, 0x1)
	/home/buildbot/.cbdepscache/exploded/x86_64/go-1.19.2/go/src/runtime/select.go:328 +0x7bc fp=0xc008202f58 sp=0xc008202e18 pc=0x454a9c
github.com/couchbase/indexing/secondary/queryport/client.(*schedTokenMonitor).updater(0xc00a856c60)
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/queryport/client/meta_client.go:2396 +0x92 fp=0xc008202fc8 sp=0xc008202f58 pc=0xd63572
github.com/couchbase/indexing/secondary/queryport/client.newSchedTokenMonitor.func1()
	/opt/build/goproj/src/github.com/couchbase/indexing/secondary/que