
[Bug]: Import failed with error alloc id failed when there are 100 tasks #37838

Open
zhuwenxing opened this issue Nov 20, 2024 · 2 comments


zhuwenxing commented Nov 20, 2024

Is there an existing issue for this?

  • I have searched the existing issues

Environment

- Milvus version: master-20241119-fceff6ed-amd64
- Deployment mode (standalone or cluster): standalone
- MQ type(rocksmq, pulsar or kafka):    
- SDK version(e.g. pymilvus v2.0.0rc2):
- OS(Ubuntu or CentOS): 
- CPU/Memory: 
- GPU: 
- Others:

Current Behavior

2024-11-20 06:11:01.354 | INFO     | __main__:prepare_data:48 - collection test_full_text_search_perf created
 RPC error: [do_bulk_insert], <MilvusException: (code=2100, message=alloc id failed, err=%wstack trace: /workspace/source/pkg/tracer/stack_trace.go:51 github.com/milvus-io/milvus/pkg/tracer.StackTrace
 /workspace/source/internal/util/grpcclient/client.go:555 github.com/milvus-io/milvus/internal/util/grpcclient.(*ClientBase[...]).Call
 /workspace/source/internal/util/grpcclient/client.go:569 github.com/milvus-io/milvus/internal/util/grpcclient.(*ClientBase[...]).ReCall
 /workspace/source/internal/distributed/rootcoord/client/client.go:107 github.com/milvus-io/milvus/internal/distributed/rootcoord/client.wrapGrpcCall[...]
 /workspace/source/internal/distributed/rootcoord/client/client.go:320 github.com/milvus-io/milvus/internal/distributed/rootcoord/client.(*Client).AllocID
 /workspace/source/internal/datacoord/allocator/allocator.go:94 github.com/milvus-io/milvus/internal/datacoord/allocator.(*rootCoordAllocator).AllocN
 /workspace/source/internal/datacoord/services.go:1712 github.com/milvus-io/milvus/internal/datacoord.(*Server).ImportV2
 /workspace/source/internal/distributed/datacoord/service.go:504 github.com/milvus-io/milvus/internal/distributed/datacoord.(*Server).ImportV2
 /workspace/source/internal/proto/datapb/data_coord_grpc.pb.go:1653 github.com/milvus-io/milvus/internal/proto/datapb._DataCoord_ImportV2_Handler.func1
 /workspace/source/internal/util/streamingutil/service/interceptor/server.go:16 github.com/milvus-io/milvus/internal/distributed/datacoord.(*Server).startGrpcLoop.NewStreamingServiceUnaryServerInterceptor.func8: context deadline exceeded: importing data failed)>, <Time:{'RPC start': '2024-11-20 06:11:01.354690', 'RPC error': '2024-11-20 06:11:12.417058'}>
 Traceback (most recent call last):
   File "prepare_data.py", line 102, in <module>
     prepare_data(host=args.host, port=args.port, data_size=args.data_size)
   File "prepare_data.py", line 51, in prepare_data
     task_id = utility.do_bulk_insert(collection_name=collection_name, files=[files])
   File "/usr/local/lib/python3.8/dist-packages/pymilvus/orm/utility.py", line 841, in do_bulk_insert
     return _get_connection(using).do_bulk_insert(
   File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 141, in handler
     raise e from e
   File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 137, in handler
     return func(*args, **kwargs)
   File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 176, in handler
     return func(self, *args, **kwargs)
   File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 116, in handler
     raise e from e
   File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 86, in handler
     return func(*args, **kwargs)
   File "/usr/local/lib/python3.8/dist-packages/pymilvus/client/grpc_handler.py", line 1699, in do_bulk_insert
     check_status(response.status)
   File "/usr/local/lib/python3.8/dist-packages/pymilvus/client/utils.py", line 63, in check_status
     raise MilvusException(status.code, status.reason, status.error_code)
 pymilvus.exceptions.MilvusException: <MilvusException: (code=2100, message=alloc id failed, err=%wstack trace: /workspace/source/pkg/tracer/stack_trace.go:51 github.com/milvus-io/milvus/pkg/tracer.StackTrace
 /workspace/source/internal/util/grpcclient/client.go:555 github.com/milvus-io/milvus/internal/util/grpcclient.(*ClientBase[...]).Call
 /workspace/source/internal/util/grpcclient/client.go:569 github.com/milvus-io/milvus/internal/util/grpcclient.(*ClientBase[...]).ReCall
 /workspace/source/internal/distributed/rootcoord/client/client.go:107 github.com/milvus-io/milvus/internal/distributed/rootcoord/client.wrapGrpcCall[...]
 /workspace/source/internal/distributed/rootcoord/client/client.go:320 github.com/milvus-io/milvus/internal/distributed/rootcoord/client.(*Client).AllocID
 /workspace/source/internal/datacoord/allocator/allocator.go:94 github.com/milvus-io/milvus/internal/datacoord/allocator.(*rootCoordAllocator).AllocN
 /workspace/source/internal/datacoord/services.go:1712 github.com/milvus-io/milvus/internal/datacoord.(*Server).ImportV2
 /workspace/source/internal/distributed/datacoord/service.go:504 github.com/milvus-io/milvus/internal/distributed/datacoord.(*Server).ImportV2
 /workspace/source/internal/proto/datapb/data_coord_grpc.pb.go:1653 github.com/milvus-io/milvus/internal/proto/datapb._DataCoord_ImportV2_Handler.func1
 /workspace/source/internal/util/streamingutil/service/interceptor/server.go:16 github.com/milvus-io/milvus/internal/distributed/datacoord.(*Server).startGrpcLoop.NewStreamingServiceUnaryServerInterceptor.func8: context deadline exceeded: importing data failed)>
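
For context, the failure is triggered on the client purely by submitting many bulk-insert tasks in a short window. A minimal sketch of the call pattern (the file paths, endpoint, and the loop count of 100 are assumptions taken from the issue title and the traceback above, not the actual prepare_data.py):

from pymilvus import connections, utility

connections.connect(host="127.0.0.1", port="19530")  # hypothetical endpoint

collection_name = "test_full_text_search_perf"
task_ids = []
for i in range(100):
    # Each do_bulk_insert call creates one import task; datacoord must first
    # allocate IDs for the task from rootcoord before any data is read.
    files = [f"bulk_data/part_{i}.parquet"]  # hypothetical object-storage path
    task_ids.append(utility.do_bulk_insert(collection_name=collection_name, files=files))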

Expected Behavior

No response

Steps To Reproduce

No response

Milvus Log

log.log

Anything else?

❯ k get pod -o wide|grep fts-performance                            
fts-performance-etcd-0                                            1/1     Running            0                18h     10.104.16.36    4am-node21   <none>           <none>
fts-performance-kafka-0                                           2/2     Running            1 (18h ago)      18h     10.104.16.38    4am-node21   <none>           <none>
fts-performance-kafka-1                                           2/2     Running            0                18h     10.104.15.141   4am-node20   <none>           <none>
fts-performance-kafka-2                                           2/2     Running            0                18h     10.104.20.16    4am-node22   <none>           <none>
fts-performance-kafka-exporter-9fbc48654-vb8xs                    1/1     Running            4 (18h ago)      18h     10.104.16.33    4am-node21   <none>           <none>
fts-performance-kafka-zookeeper-0                                 1/1     Running            0                18h     10.104.16.39    4am-node21   <none>           <none>
fts-performance-kafka-zookeeper-1                                 1/1     Running            0                18h     10.104.21.50    4am-node24   <none>           <none>
fts-performance-kafka-zookeeper-2                                 1/1     Running            0                18h     10.104.15.142   4am-node20   <none>           <none>
fts-performance-milvus-standalone-9b6c6749d-k4zwz                 1/1     Running            0                18h     10.104.12.78    4am-node17   <none>           <none>
fts-performance-minio-0                                           1/1     Running            0                18h     10.104.16.37    4am-node21   <none>           <none>
fts-performance-minio-1                                           1/1     Running            0                18h     10.104.21.49    4am-node24   <none>           <none>
fts-performance-minio-2                                           1/1     Running            0                18h     10.104.15.140   4am-node20   <none>           <none>
fts-performance-minio-3 
@zhuwenxing zhuwenxing added kind/bug Issues or changes related a bug needs-triage Indicates an issue or PR lacks a `triage/foo` label and requires one. labels Nov 20, 2024
@yanliang567

/assign @czs007
/unassign

@sre-ci-robot sre-ci-robot assigned czs007 and unassigned yanliang567 Nov 20, 2024
@yanliang567 yanliang567 added triage/accepted Indicates an issue or PR is ready to be actively worked on. and removed needs-triage Indicates an issue or PR lacks a `triage/foo` label and requires one. labels Nov 20, 2024
@yanliang567 yanliang567 added this to the 2.5.0 milestone Nov 20, 2024
@czs007 czs007 assigned xiaocai2333 and unassigned czs007 Nov 21, 2024
@congqixia

The error happened before the actual import task began.
When datacoord tried to allocate IDs for the import task from rootcoord, the call failed with a timeout.

// Allocate file ids.
// AllocN requests a block of IDs from rootcoord; in this case the RPC timed out
// ("context deadline exceeded" in the stack trace above).
idStart, _, err := s.allocator.AllocN(int64(len(files)) + 1)
if err != nil {
    // Note: fmt.Sprint does not expand the %w verb, which is why the reported
    // message contains the literal "err=%w".
    resp.Status = merr.Status(merr.WrapErrImportFailed(fmt.Sprint("alloc id failed, err=%w", err)))
    return resp, nil
}

It looks like the metrics show no problem, so this could be a transient environment issue.
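
If it is a transient timeout, a client-side mitigation (a minimal sketch, not part of Milvus or pymilvus; the helper name, retry count, and backoff are arbitrary, and an open connection is assumed) is to retry the bulk-insert call when it fails this way:

import time

from pymilvus import utility
from pymilvus.exceptions import MilvusException

def do_bulk_insert_with_retry(collection_name, files, retries=3, backoff=5.0):
    # Retry the call a few times with a growing delay, since the
    # "alloc id failed ... context deadline exceeded" error looks transient.
    for attempt in range(retries):
        try:
            return utility.do_bulk_insert(collection_name=collection_name, files=files)
        except MilvusException:
            if attempt == retries - 1:
                raise
            time.sleep(backoff * (attempt + 1))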
