[CI] EsqlSpecIT class failing #125170

Open

elasticsearchmachine opened this issue Mar 18, 2025 · 2 comments
Assignees
Labels
:Analytics/ES|QL (AKA ESQL)
medium-risk (an open issue or test failure that is a medium risk to future releases)
Team:Analytics (meta label for analytical engine team: ESQL/Aggs/Geo)
>test-failure (triaged test failures from CI)

Comments

@elasticsearchmachine
Collaborator

elasticsearchmachine commented Mar 18, 2025

Build Scans:

Reproduction Line:

./gradlew ":x-pack:plugin:esql:qa:server:multi-node:javaRestTest" --tests "org.elasticsearch.xpack.esql.qa.multi_node.EsqlSpecIT" -Dtests.method="test {match-function.TestMultiValuedFieldWithConjunction SYNC}" -Dtests.seed=2FC72A5116028313 -Dtests.locale=ca-AD -Dtests.timezone=America/Rio_Branco -Druntime.java=24

Applicable branches:
8.19

Reproduces locally?:
N/A

Failure History:
See dashboard

Failure Message:

java.net.ConnectException: Connection refused

Issue Reasons:

  • [8.19] 10 failures in class org.elasticsearch.xpack.esql.qa.multi_node.EsqlSpecIT (1.3% fail rate in 752 executions)
  • [8.19] 7 failures in pipeline elasticsearch-periodic-platform-support (33.3% fail rate in 21 executions)
  • [8.19] 2 failures in pipeline elasticsearch-periodic (9.5% fail rate in 21 executions)

Note:
This issue was created using new test triage automation. Please report issues or feedback to es-delivery.

@elasticsearchmachine elasticsearchmachine added :Analytics/ES|QL AKA ESQL >test-failure Triaged test failures from CI Team:Analytics Meta label for analytical engine team (ESQL/Aggs/Geo) needs:risk Requires assignment of a risk label (low, medium, blocker) labels Mar 18, 2025
@elasticsearchmachine
Collaborator Author

Pinging @elastic/es-analytical-engine (Team:Analytics)

@dnhatn
Member

dnhatn commented Mar 18, 2025

@ioanatia The semantic_text issue just happened again.

[2025-03-18T22:18:49,193][ERROR][o.e.b.ElasticsearchUncaughtExceptionHandler] [test-cluster-0] fatal error in thread [elasticsearch[test-cluster-0][generic][T#12]], exiting java.lang.AssertionError: java.lang.AssertionError: seqNo [0] was processed twice in generation [2], with different data. prvOp [Index{id='1', seqNo=0, primaryTerm=1, version=1, autoGeneratedIdTimestamp=-1} source: {st_logs=2024-12-23T12:15:00.000Z 1.2.3.4 [email protected] 4553, st_double=5.20128E11, st_geopoint=POINT(42.97109630194 14.7552534413725), _inference_fields={st_logs={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_logs=[{start_offset=0, end_offset=57, embeddings={feature_0=9.994094E7, feature_1=9.994094E7, feature_2=9.994094E7, feature_3=9.994095E7, feature_4=9.994095E7}}]}}}, st_geopoint={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_geopoint=[{start_offset=0, end_offset=38, embeddings={feature_0=4.8845428E7, feature_1=4.8845428E7, feature_2=4.8845428E7, feature_3=4.884543E7, feature_4=4.884543E7}}]}}}, st_double={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_double=[{start_offset=0, end_offset=10, embeddings={feature_0=1.02139264E9, feature_1=1.02139264E9, feature_2=1.02139264E9, feature_3=1.02139264E9, feature_4=1.02139264E9}}]}}}, st_integer={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_integer=[{start_offset=0, end_offset=2, embeddings={feature_0=1602.0, feature_1=1603.0, feature_2=1604.0, feature_3=1605.0, feature_4=1606.0}}]}}}, st_geoshape={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_geoshape=[{start_offset=0, end_offset=45, embeddings={feature_0=2.9025277E8, feature_1=2.9025277E8, feature_2=2.9025277E8, feature_3=2.9025277E8, feature_4=2.9025277E8}}]}}}, 
st_unicode={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_unicode=[{start_offset=0, end_offset=5, embeddings={feature_0=2.0297848E9, feature_1=2.0297848E9, feature_2=2.0297848E9, feature_3=2.0297848E9, feature_4=2.0297848E9}}]}}}, st_datetime={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_datetime=[{start_offset=0, end_offset=24, embeddings={feature_0=1.7914711E9, feature_1=1.7914711E9, feature_2=1.7914711E9, feature_3=1.7914711E9, feature_4=1.7914711E9}}]}}}, st_unsigned_long={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_unsigned_long=[{start_offset=0, end_offset=10, embeddings={feature_0=3.597711E8, feature_1=3.597711E8, feature_2=3.597711E8, feature_3=3.597711E8, feature_4=3.597711E8}}]}}}, st_bool={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_bool=[{start_offset=0, end_offset=5, embeddings={feature_0=9.719632E7, feature_1=9.719633E7, feature_2=9.719633E7, feature_3=9.719633E7, feature_4=9.719633E7}}]}}}, st_ip={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_ip=[{start_offset=0, end_offset=7, embeddings={feature_0=1.9016198E9, feature_1=1.9016198E9, feature_2=1.9016198E9, feature_3=1.9016198E9, feature_4=1.9016198E9}}]}}}, st_base64={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_base64=[{start_offset=0, end_offset=12, embeddings={feature_0=1.4194486E9, feature_1=1.4194486E9, feature_2=1.4194486E9, feature_3=1.4194486E9, feature_4=1.4194486E9}}]}}}, st_multi_value={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_multi_value=[{start_offset=0, end_offset=12, embeddings={feature_0=1.457005E8, feature_1=1.457005E8, feature_2=1.4570051E8, feature_3=1.4570051E8, 
feature_4=1.4570051E8}}, {start_offset=13, end_offset=35, embeddings={feature_0=8.689444E8, feature_1=8.689444E8, feature_2=8.689444E8, feature_3=8.689444E8, feature_4=8.689444E8}}, {start_offset=36, end_offset=56, embeddings={feature_0=1.5083834E9, feature_1=1.5083834E9, feature_2=1.5083834E9, feature_3=1.5083834E9, feature_4=1.5083834E9}}]}}}, st_long={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_long=[{start_offset=0, end_offset=10, embeddings={feature_0=3.597711E8, feature_1=3.597711E8, feature_2=3.597711E8, feature_3=3.597711E8, feature_4=3.597711E8}}]}}}, semantic_text_field={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={semantic_text_field=[{start_offset=0, end_offset=21, embeddings={feature_0=2.9864544E8, feature_1=2.9864544E8, feature_2=2.9864544E8, feature_3=2.9864544E8, feature_4=2.9864544E8}}]}}}, st_cartesian_point={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_cartesian_point=[{start_offset=0, end_offset=23, embeddings={feature_0=2.032471E9, feature_1=2.032471E9, feature_2=2.032471E9, feature_3=2.032471E9, feature_4=2.032471E9}}]}}}, st_version={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_version=[{start_offset=0, end_offset=5, embeddings={feature_0=4.6672444E7, feature_1=4.6672444E7, feature_2=4.6672444E7, feature_3=4.667245E7, feature_4=4.667245E7}}]}}}}, st_integer=23, st_geoshape=POLYGON ((30 10, 40 40, 20 40, 10 20, 30 10)), description=some description1, st_datetime=1953-09-02T00:00:00.000Z, st_unicode=你吃饭了吗, st_unsigned_long=2147483648, st_bool=false, st_ip=1.1.1.1, st_base64=ZWxhc3RpYw==, st_multi_value=[Hello there!, This is a random value, for testing purposes], st_long=2147483648, host=host1, semantic_text_field=live long and prosper, st_cartesian_point=POINT(4297.11 -1475.53), language_name=English, value=1001, 
st_version=1.2.3}], newOp [Index{id='1', seqNo=0, primaryTerm=1, version=1, autoGeneratedIdTimestamp=-1} source: {st_logs=2024-12-23T12:15:00.000Z 1.2.3.4 [email protected] 4553, st_double=5.20128E11, st_geopoint=POINT(42.97109630194 14.7552534413725), st_integer=23, st_geoshape=POLYGON ((30 10, 40 40, 20 40, 10 20, 30 10)), description=some description1, st_datetime=1953-09-02T00:00:00.000Z, st_unicode=你吃饭了吗, st_unsigned_long=2147483648, st_bool=false, st_ip=1.1.1.1, st_base64=ZWxhc3RpYw==, st_multi_value=[Hello there!, This is a random value, for testing purposes], st_long=2147483648, host=host1, semantic_text_field=live long and prosper, st_cartesian_point=POINT(4297.11 -1475.53), language_name=English, value=1001, st_version=1.2.3, _inference_fields={st_logs={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_logs=[{start_offset=0, end_offset=57, embeddings={feature_0=9.987686E7, feature_1=9.987686E7, feature_2=9.987686E7, feature_3=9.987686E7, feature_4=9.987686E7}}]}}}, st_geopoint={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_geopoint=[{start_offset=0, end_offset=38, embeddings={feature_0=4.8758784E7, feature_1=4.8758784E7, feature_2=4.8758784E7, feature_3=4.8758784E7, feature_4=4.8758784E7}}]}}}, st_double={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_double=[{start_offset=0, end_offset=10, embeddings={feature_0=1.021313E9, feature_1=1.021313E9, feature_2=1.021313E9, feature_3=1.021313E9, feature_4=1.021313E9}}]}}}, st_integer={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_integer=[{start_offset=0, end_offset=2, embeddings={feature_0=1600.0, feature_1=1600.0, feature_2=1604.0, feature_3=1604.0, feature_4=1604.0}}]}}}, st_geoshape={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, 
chunks={st_geoshape=[{start_offset=0, end_offset=45, embeddings={feature_0=2.8940698E8, feature_1=2.8940698E8, feature_2=2.8940698E8, feature_3=2.8940698E8, feature_4=2.8940698E8}}]}}}, st_unicode={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_unicode=[{start_offset=0, end_offset=5, embeddings={feature_0=2.0258488E9, feature_1=2.0258488E9, feature_2=2.0258488E9, feature_3=2.0258488E9, feature_4=2.0258488E9}}]}}}, st_datetime={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_datetime=[{start_offset=0, end_offset=24, embeddings={feature_0=1.7909678E9, feature_1=1.7909678E9, feature_2=1.7909678E9, feature_3=1.7909678E9, feature_4=1.7909678E9}}]}}}, st_unsigned_long={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_unsigned_long=[{start_offset=0, end_offset=10, embeddings={feature_0=3.5966157E8, feature_1=3.5966157E8, feature_2=3.5966157E8, feature_3=3.5966157E8, feature_4=3.5966157E8}}]}}}, st_bool={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_bool=[{start_offset=0, end_offset=5, embeddings={feature_0=9.699328E7, feature_1=9.699328E7, feature_2=9.699328E7, feature_3=9.699328E7, feature_4=9.699328E7}}]}}}, st_ip={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_ip=[{start_offset=0, end_offset=7, embeddings={feature_0=1.9000197E9, feature_1=1.9000197E9, feature_2=1.9000197E9, feature_3=1.9000197E9, feature_4=1.9000197E9}}]}}}, st_base64={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_base64=[{start_offset=0, end_offset=12, embeddings={feature_0=1.4176748E9, feature_1=1.4176748E9, feature_2=1.4176748E9, feature_3=1.4176748E9, feature_4=1.4176748E9}}]}}}, st_multi_value={inference={inference_id=test_sparse_inference, 
model_settings={task_type=sparse_embedding}, chunks={st_multi_value=[{start_offset=0, end_offset=12, embeddings={feature_0=1.4522778E8, feature_1=1.4522778E8, feature_2=1.4522778E8, feature_3=1.4522778E8, feature_4=1.4522778E8}}, {start_offset=13, end_offset=35, embeddings={feature_0=8.682209E8, feature_1=8.682209E8, feature_2=8.682209E8, feature_3=8.682209E8, feature_4=8.682209E8}}, {start_offset=36, end_offset=56, embeddings={feature_0=1.5057551E9, feature_1=1.5057551E9, feature_2=1.5057551E9, feature_3=1.5057551E9, feature_4=1.5057551E9}}]}}}, st_long={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_long=[{start_offset=0, end_offset=10, embeddings={feature_0=3.5966157E8, feature_1=3.5966157E8, feature_2=3.5966157E8, feature_3=3.5966157E8, feature_4=3.5966157E8}}]}}}, semantic_text_field={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={semantic_text_field=[{start_offset=0, end_offset=21, embeddings={feature_0=2.977956E8, feature_1=2.977956E8, feature_2=2.977956E8, feature_3=2.977956E8, feature_4=2.977956E8}}]}}}, st_cartesian_point={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_cartesian_point=[{start_offset=0, end_offset=23, embeddings={feature_0=2.0300431E9, feature_1=2.0300431E9, feature_2=2.0300431E9, feature_3=2.0300431E9, feature_4=2.0300431E9}}]}}}, st_version={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_version=[{start_offset=0, end_offset=5, embeddings={feature_0=4.666163E7, feature_1=4.666163E7, feature_2=4.666163E7, feature_3=4.666163E7, feature_4=4.666163E7}}]}}}}}]
	at [email protected]/org.elasticsearch.index.translog.TranslogWriter.assertNoSeqNumberConflict(TranslogWriter.java:307)
	at [email protected]/org.elasticsearch.index.translog.TranslogWriter.add(TranslogWriter.java:265)
	at [email protected]/org.elasticsearch.index.translog.Translog.add(Translog.java:633)
	at [email protected]/org.elasticsearch.index.engine.InternalEngine.index(InternalEngine.java:1229)
	at [email protected]/org.elasticsearch.index.shard.IndexShard.index(IndexShard.java:1089)
	at [email protected]/org.elasticsearch.index.shard.IndexShard.applyIndexOperation(IndexShard.java:1015)
	at [email protected]/org.elasticsearch.index.shard.IndexShard.applyTranslogOperation(IndexShard.java:2031)
	at [email protected]/org.elasticsearch.index.shard.IndexShard.applyTranslogOperation(IndexShard.java:2018)
	at [email protected]/org.elasticsearch.indices.recovery.RecoveryTarget.lambda$indexTranslogOperations$4(RecoveryTarget.java:454)
	at [email protected]/org.elasticsearch.action.ActionListener.completeWith(ActionListener.java:356)
	at [email protected]/org.elasticsearch.indices.recovery.RecoveryTarget.indexTranslogOperations(RecoveryTarget.java:429)
	at [email protected]/org.elasticsearch.indices.recovery.PeerRecoveryTargetService$TranslogOperationsRequestHandler.performTranslogOps(PeerRecoveryTargetService.java:649)
	at [email protected]/org.elasticsearch.indices.recovery.PeerRecoveryTargetService$TranslogOperationsRequestHandler.handleRequest(PeerRecoveryTargetService.java:596)
	at [email protected]/org.elasticsearch.indices.recovery.PeerRecoveryTargetService$TranslogOperationsRequestHandler.handleRequest(PeerRecoveryTargetService.java:588)
	at [email protected]/org.elasticsearch.indices.recovery.PeerRecoveryTargetService$RecoveryRequestHandler.messageReceived(PeerRecoveryTargetService.java:682)
	at [email protected]/org.elasticsearch.indices.recovery.PeerRecoveryTargetService$RecoveryRequestHandler.messageReceived(PeerRecoveryTargetService.java:669)
	at [email protected]/org.elasticsearch.transport.RequestHandlerRegistry.processMessageReceived(RequestHandlerRegistry.java:90)
	at [email protected]/org.elasticsearch.transport.InboundHandler.doHandleRequest(InboundHandler.java:289)
	at [email protected]/org.elasticsearch.transport.InboundHandler$1.doRun(InboundHandler.java:302)
	at [email protected]/org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:1044)
	at [email protected]/org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:27)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1095)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:619)
	at java.base/java.lang.Thread.run(Thread.java:1447)
Caused by: java.lang.AssertionError: seqNo [0] was processed twice in generation [2], with different data. prvOp [Index{id='1', seqNo=0, primaryTerm=1, version=1, autoGeneratedIdTimestamp=-1} source: {st_logs=2024-12-23T12:15:00.000Z 1.2.3.4 [email protected] 4553, st_double=5.20128E11, st_geopoint=POINT(42.97109630194 14.7552534413725), _inference_fields={st_logs={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_logs=[{start_offset=0, end_offset=57, embeddings={feature_0=9.994094E7, feature_1=9.994094E7, feature_2=9.994094E7, feature_3=9.994095E7, feature_4=9.994095E7}}]}}}, st_geopoint={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_geopoint=[{start_offset=0, end_offset=38, embeddings={feature_0=4.8845428E7, feature_1=4.8845428E7, feature_2=4.8845428E7, feature_3=4.884543E7, feature_4=4.884543E7}}]}}}, st_double={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_double=[{start_offset=0, end_offset=10, embeddings={feature_0=1.02139264E9, feature_1=1.02139264E9, feature_2=1.02139264E9, feature_3=1.02139264E9, feature_4=1.02139264E9}}]}}}, st_integer={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_integer=[{start_offset=0, end_offset=2, embeddings={feature_0=1602.0, feature_1=1603.0, feature_2=1604.0, feature_3=1605.0, feature_4=1606.0}}]}}}, st_geoshape={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_geoshape=[{start_offset=0, end_offset=45, embeddings={feature_0=2.9025277E8, feature_1=2.9025277E8, feature_2=2.9025277E8, feature_3=2.9025277E8, feature_4=2.9025277E8}}]}}}, st_unicode={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_unicode=[{start_offset=0, end_offset=5, embeddings={feature_0=2.0297848E9, feature_1=2.0297848E9, 
feature_2=2.0297848E9, feature_3=2.0297848E9, feature_4=2.0297848E9}}]}}}, st_datetime={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_datetime=[{start_offset=0, end_offset=24, embeddings={feature_0=1.7914711E9, feature_1=1.7914711E9, feature_2=1.7914711E9, feature_3=1.7914711E9, feature_4=1.7914711E9}}]}}}, st_unsigned_long={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_unsigned_long=[{start_offset=0, end_offset=10, embeddings={feature_0=3.597711E8, feature_1=3.597711E8, feature_2=3.597711E8, feature_3=3.597711E8, feature_4=3.597711E8}}]}}}, st_bool={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_bool=[{start_offset=0, end_offset=5, embeddings={feature_0=9.719632E7, feature_1=9.719633E7, feature_2=9.719633E7, feature_3=9.719633E7, feature_4=9.719633E7}}]}}}, st_ip={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_ip=[{start_offset=0, end_offset=7, embeddings={feature_0=1.9016198E9, feature_1=1.9016198E9, feature_2=1.9016198E9, feature_3=1.9016198E9, feature_4=1.9016198E9}}]}}}, st_base64={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_base64=[{start_offset=0, end_offset=12, embeddings={feature_0=1.4194486E9, feature_1=1.4194486E9, feature_2=1.4194486E9, feature_3=1.4194486E9, feature_4=1.4194486E9}}]}}}, st_multi_value={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_multi_value=[{start_offset=0, end_offset=12, embeddings={feature_0=1.457005E8, feature_1=1.457005E8, feature_2=1.4570051E8, feature_3=1.4570051E8, feature_4=1.4570051E8}}, {start_offset=13, end_offset=35, embeddings={feature_0=8.689444E8, feature_1=8.689444E8, feature_2=8.689444E8, feature_3=8.689444E8, feature_4=8.689444E8}}, {start_offset=36, end_offset=56, 
embeddings={feature_0=1.5083834E9, feature_1=1.5083834E9, feature_2=1.5083834E9, feature_3=1.5083834E9, feature_4=1.5083834E9}}]}}}, st_long={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_long=[{start_offset=0, end_offset=10, embeddings={feature_0=3.597711E8, feature_1=3.597711E8, feature_2=3.597711E8, feature_3=3.597711E8, feature_4=3.597711E8}}]}}}, semantic_text_field={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={semantic_text_field=[{start_offset=0, end_offset=21, embeddings={feature_0=2.9864544E8, feature_1=2.9864544E8, feature_2=2.9864544E8, feature_3=2.9864544E8, feature_4=2.9864544E8}}]}}}, st_cartesian_point={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_cartesian_point=[{start_offset=0, end_offset=23, embeddings={feature_0=2.032471E9, feature_1=2.032471E9, feature_2=2.032471E9, feature_3=2.032471E9, feature_4=2.032471E9}}]}}}, st_version={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_version=[{start_offset=0, end_offset=5, embeddings={feature_0=4.6672444E7, feature_1=4.6672444E7, feature_2=4.6672444E7, feature_3=4.667245E7, feature_4=4.667245E7}}]}}}}, st_integer=23, st_geoshape=POLYGON ((30 10, 40 40, 20 40, 10 20, 30 10)), description=some description1, st_datetime=1953-09-02T00:00:00.000Z, st_unicode=你吃饭了吗, st_unsigned_long=2147483648, st_bool=false, st_ip=1.1.1.1, st_base64=ZWxhc3RpYw==, st_multi_value=[Hello there!, This is a random value, for testing purposes], st_long=2147483648, host=host1, semantic_text_field=live long and prosper, st_cartesian_point=POINT(4297.11 -1475.53), language_name=English, value=1001, st_version=1.2.3}], newOp [Index{id='1', seqNo=0, primaryTerm=1, version=1, autoGeneratedIdTimestamp=-1} source: {st_logs=2024-12-23T12:15:00.000Z 1.2.3.4 [email protected] 4553, st_double=5.20128E11, 
st_geopoint=POINT(42.97109630194 14.7552534413725), st_integer=23, st_geoshape=POLYGON ((30 10, 40 40, 20 40, 10 20, 30 10)), description=some description1, st_datetime=1953-09-02T00:00:00.000Z, st_unicode=你吃饭了吗, st_unsigned_long=2147483648, st_bool=false, st_ip=1.1.1.1, st_base64=ZWxhc3RpYw==, st_multi_value=[Hello there!, This is a random value, for testing purposes], st_long=2147483648, host=host1, semantic_text_field=live long and prosper, st_cartesian_point=POINT(4297.11 -1475.53), language_name=English, value=1001, st_version=1.2.3, _inference_fields={st_logs={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_logs=[{start_offset=0, end_offset=57, embeddings={feature_0=9.987686E7, feature_1=9.987686E7, feature_2=9.987686E7, feature_3=9.987686E7, feature_4=9.987686E7}}]}}}, st_geopoint={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_geopoint=[{start_offset=0, end_offset=38, embeddings={feature_0=4.8758784E7, feature_1=4.8758784E7, feature_2=4.8758784E7, feature_3=4.8758784E7, feature_4=4.8758784E7}}]}}}, st_double={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_double=[{start_offset=0, end_offset=10, embeddings={feature_0=1.021313E9, feature_1=1.021313E9, feature_2=1.021313E9, feature_3=1.021313E9, feature_4=1.021313E9}}]}}}, st_integer={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_integer=[{start_offset=0, end_offset=2, embeddings={feature_0=1600.0, feature_1=1600.0, feature_2=1604.0, feature_3=1604.0, feature_4=1604.0}}]}}}, st_geoshape={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_geoshape=[{start_offset=0, end_offset=45, embeddings={feature_0=2.8940698E8, feature_1=2.8940698E8, feature_2=2.8940698E8, feature_3=2.8940698E8, feature_4=2.8940698E8}}]}}}, 
st_unicode={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_unicode=[{start_offset=0, end_offset=5, embeddings={feature_0=2.0258488E9, feature_1=2.0258488E9, feature_2=2.0258488E9, feature_3=2.0258488E9, feature_4=2.0258488E9}}]}}}, st_datetime={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_datetime=[{start_offset=0, end_offset=24, embeddings={feature_0=1.7909678E9, feature_1=1.7909678E9, feature_2=1.7909678E9, feature_3=1.7909678E9, feature_4=1.7909678E9}}]}}}, st_unsigned_long={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_unsigned_long=[{start_offset=0, end_offset=10, embeddings={feature_0=3.5966157E8, feature_1=3.5966157E8, feature_2=3.5966157E8, feature_3=3.5966157E8, feature_4=3.5966157E8}}]}}}, st_bool={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_bool=[{start_offset=0, end_offset=5, embeddings={feature_0=9.699328E7, feature_1=9.699328E7, feature_2=9.699328E7, feature_3=9.699328E7, feature_4=9.699328E7}}]}}}, st_ip={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_ip=[{start_offset=0, end_offset=7, embeddings={feature_0=1.9000197E9, feature_1=1.9000197E9, feature_2=1.9000197E9, feature_3=1.9000197E9, feature_4=1.9000197E9}}]}}}, st_base64={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_base64=[{start_offset=0, end_offset=12, embeddings={feature_0=1.4176748E9, feature_1=1.4176748E9, feature_2=1.4176748E9, feature_3=1.4176748E9, feature_4=1.4176748E9}}]}}}, st_multi_value={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_multi_value=[{start_offset=0, end_offset=12, embeddings={feature_0=1.4522778E8, feature_1=1.4522778E8, feature_2=1.4522778E8, feature_3=1.4522778E8, 
feature_4=1.4522778E8}}, {start_offset=13, end_offset=35, embeddings={feature_0=8.682209E8, feature_1=8.682209E8, feature_2=8.682209E8, feature_3=8.682209E8, feature_4=8.682209E8}}, {start_offset=36, end_offset=56, embeddings={feature_0=1.5057551E9, feature_1=1.5057551E9, feature_2=1.5057551E9, feature_3=1.5057551E9, feature_4=1.5057551E9}}]}}}, st_long={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_long=[{start_offset=0, end_offset=10, embeddings={feature_0=3.5966157E8, feature_1=3.5966157E8, feature_2=3.5966157E8, feature_3=3.5966157E8, feature_4=3.5966157E8}}]}}}, semantic_text_field={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={semantic_text_field=[{start_offset=0, end_offset=21, embeddings={feature_0=2.977956E8, feature_1=2.977956E8, feature_2=2.977956E8, feature_3=2.977956E8, feature_4=2.977956E8}}]}}}, st_cartesian_point={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_cartesian_point=[{start_offset=0, end_offset=23, embeddings={feature_0=2.0300431E9, feature_1=2.0300431E9, feature_2=2.0300431E9, feature_3=2.0300431E9, feature_4=2.0300431E9}}]}}}, st_version={inference={inference_id=test_sparse_inference, model_settings={task_type=sparse_embedding}, chunks={st_version=[{start_offset=0, end_offset=5, embeddings={feature_0=4.666163E7, feature_1=4.666163E7, feature_2=4.666163E7, feature_3=4.666163E7, feature_4=4.666163E7}}]}}}}}]
	... 24 more
Caused by: java.lang.RuntimeException: stack capture previous op
	at [email protected]/org.elasticsearch.index.translog.TranslogWriter.assertNoSeqNumberConflict(TranslogWriter.java:313)
	at [email protected]/org.elasticsearch.index.translog.TranslogWriter.add(TranslogWriter.java:265)
	at [email protected]/org.elasticsearch.index.translog.Translog.add(Translog.java:633)
	at [email protected]/org.elasticsearch.index.engine.InternalEngine.index(InternalEngine.java:1229)
	at [email protected]/org.elasticsearch.index.shard.IndexShard.index(IndexShard.java:1089)
	at [email protected]/org.elasticsearch.index.shard.IndexShard.applyIndexOperation(IndexShard.java:1015)
	at [email protected]/org.elasticsearch.index.shard.IndexShard.applyIndexOperationOnReplica(IndexShard.java:956)
	at [email protected]/org.elasticsearch.action.bulk.TransportShardBulkAction.performOpOnReplica(TransportShardBulkAction.java:713)
	at [email protected]/org.elasticsearch.action.bulk.TransportShardBulkAction.performOnReplica(TransportShardBulkAction.java:690)
	at [email protected]/org.elasticsearch.action.bulk.TransportShardBulkAction.lambda$dispatchedShardOperationOnReplica$7(TransportShardBulkAction.java:646)
	at [email protected]/org.elasticsearch.action.ActionListener.completeWith(ActionListener.java:356)
	at [email protected]/org.elasticsearch.action.bulk.TransportShardBulkAction.dispatchedShardOperationOnReplica(TransportShardBulkAction.java:644)
	at [email protected]/org.elasticsearch.action.bulk.TransportShardBulkAction.dispatchedShardOperationOnReplica(TransportShardBulkAction.java:83)
	at [email protected]/org.elasticsearch.action.support.replication.TransportWriteAction$2.doRun(TransportWriteAction.java:248)
	at [email protected]/org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:27)
	at [email protected]/org.elasticsearch.common.util.concurrent.TimedRunnable.doRun(TimedRunnable.java:34)
	... 5 more
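For context, the tripped assertion in TranslogWriter enforces that within a single translog generation, an operation with an already-seen sequence number may only be re-added with byte-identical data. The following is a hypothetical, heavily simplified sketch of that invariant, not the actual Elasticsearch implementation (class and method names here are invented for illustration):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical, simplified model of the invariant behind
// TranslogWriter#assertNoSeqNumberConflict: within one translog
// generation, the same seqNo may only be re-added with identical data.
class SeqNoConflictChecker {
    private final Map<Long, String> seenOps = new HashMap<>();

    /** Returns true if the op is consistent; false on a conflicting duplicate. */
    boolean add(long seqNo, String source) {
        String previous = seenOps.putIfAbsent(seqNo, source);
        // A replay of the same seqNo (e.g. during peer recovery) is fine
        // only if the replayed source matches what was written before.
        return previous == null || previous.equals(source);
    }
}

public class SeqNoConflictDemo {
    public static void main(String[] args) {
        SeqNoConflictChecker checker = new SeqNoConflictChecker();
        System.out.println(checker.add(0, "{\"value\":1001}"));          // first write: true
        System.out.println(checker.add(0, "{\"value\":1001}"));          // identical replay: true
        System.out.println(checker.add(0, "{\"value\":1001,\"x\":2}"));  // different data: false
    }
}
```

In the failure above, the replica saw seqNo 0 twice for the same document, once with `_inference_fields` containing one set of embedding values and once with different values, which corresponds to the third call in the sketch.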

@dnhatn dnhatn added medium-risk An open issue or test failure that is a medium risk to future releases and removed needs:risk Requires assignment of a risk label (low, medium, blocker) labels Mar 18, 2025
@jimczi jimczi assigned jimczi and unassigned ioanatia Mar 19, 2025
4 participants