|
2024-07-02 09:47:41 | INFO | model_worker | args: Namespace(awq_ckpt=None, awq_groupsize=-1, awq_wbits=16, controller_address='http://127.0.0.1:21002', conv_template=None, cpu_offloading=False, debug=False, device='cuda', dtype=None, embed_in_truncate=False, enable_exllama=False, enable_xft=False, exllama_cache_8bit=False, exllama_gpu_split=None, exllama_max_seq_len=4096, gptq_act_order=False, gptq_ckpt=None, gptq_groupsize=-1, gptq_wbits=16, gpus=None, host='127.0.0.1', limit_worker_concurrency=5, load_8bit=False, max_gpu_memory=None, model_names=None, model_path='lmsys/vicuna-7b-v1.5', no_register=False, num_gpus=1, port=21003, revision='main', seed=None, ssl=False, stream_interval=2, worker_address='http://127.0.0.1:21003', xft_dtype=None, xft_max_seq_len=4096) |
|
2024-07-02 09:47:41 | INFO | model_worker | Loading the model ['vicuna-7b-v1.5'] on worker 9de16774 ... |
|
2024-07-02 09:47:41 | ERROR | stderr | /usr/local/lib/python3.8/dist-packages/torch/storage.py:315: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. |
|
2024-07-02 09:47:41 | ERROR | stderr | warnings.warn(message, UserWarning) |
|
2024-07-02 09:47:41 | ERROR | stderr |
Loading checkpoint shards: 0%| | 0/2 [00:00<?, ?it/s] |
|
2024-07-02 09:47:46 | ERROR | stderr |
Loading checkpoint shards:  50%|████████████████████████████                            | 1/2 [00:04<00:04,  4.73s/it]
|
2024-07-02 09:47:48 | ERROR | stderr |
Loading checkpoint shards: 100%|███████████████████████████████████████████████████████| 2/2 [00:06<00:00,  3.17s/it]
|
2024-07-02 09:47:48 | ERROR | stderr |
Loading checkpoint shards: 100%|███████████████████████████████████████████████████████| 2/2 [00:06<00:00,  3.41s/it]
|
2024-07-02 09:47:48 | ERROR | stderr | |
|
2024-07-02 09:47:48 | ERROR | stderr | /usr/local/lib/python3.8/dist-packages/transformers/generation/configuration_utils.py:540: UserWarning: `do_sample` is set to `False`. However, `temperature` is set to `0.9` -- this flag is only used in sample-based generation modes. You should set `do_sample=True` or unset `temperature`. This was detected when initializing the generation config instance, which means the corresponding file may hold incorrect parameterization and should be fixed. |
|
2024-07-02 09:47:48 | ERROR | stderr | warnings.warn( |
|
2024-07-02 09:47:48 | ERROR | stderr | /usr/local/lib/python3.8/dist-packages/transformers/generation/configuration_utils.py:545: UserWarning: `do_sample` is set to `False`. However, `top_p` is set to `0.6` -- this flag is only used in sample-based generation modes. You should set `do_sample=True` or unset `top_p`. This was detected when initializing the generation config instance, which means the corresponding file may hold incorrect parameterization and should be fixed. |
|
2024-07-02 09:47:48 | ERROR | stderr | warnings.warn( |
|
2024-07-02 09:47:48 | ERROR | stderr | /usr/local/lib/python3.8/dist-packages/transformers/generation/configuration_utils.py:540: UserWarning: `do_sample` is set to `False`. However, `temperature` is set to `0.9` -- this flag is only used in sample-based generation modes. You should set `do_sample=True` or unset `temperature`. |
|
2024-07-02 09:47:48 | ERROR | stderr | warnings.warn( |
|
2024-07-02 09:47:48 | ERROR | stderr | /usr/local/lib/python3.8/dist-packages/transformers/generation/configuration_utils.py:545: UserWarning: `do_sample` is set to `False`. However, `top_p` is set to `0.6` -- this flag is only used in sample-based generation modes. You should set `do_sample=True` or unset `top_p`. |
|
2024-07-02 09:47:48 | ERROR | stderr | warnings.warn( |
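
The four UserWarnings above are non-fatal: the checkpoint's generation_config.json apparently ships `do_sample=False` together with `temperature=0.9` and `top_p=0.6`, so transformers flags the unused sampling fields. A minimal sketch of one way to reconcile them, assuming the standard transformers `GenerationConfig` API (the save path below is hypothetical):

```python
# Sketch: reconcile the sampling flags that trigger the warnings above.
# Assumes `transformers` is installed; the save directory is a hypothetical example.
from transformers import GenerationConfig

cfg = GenerationConfig.from_pretrained("lmsys/vicuna-7b-v1.5")

# Option A: enable sampling so temperature/top_p actually take effect.
cfg.do_sample = True

# Option B (instead): drop the sampling-only fields and keep greedy decoding.
# cfg.temperature = None
# cfg.top_p = None

cfg.save_pretrained("./vicuna-7b-v1.5-genconfig")  # hypothetical local path
```

Either way, the worker keeps serving as the rest of this log shows; the warnings only point at the packaged generation config.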
|
2024-07-02 09:47:52 | INFO | model_worker | Register to controller |
|
2024-07-02 09:47:52 | ERROR | stderr | INFO:     Started server process [51055]

2024-07-02 09:47:52 | ERROR | stderr | INFO:     Waiting for application startup.

2024-07-02 09:47:52 | ERROR | stderr | INFO:     Application startup complete.

2024-07-02 09:47:52 | ERROR | stderr | INFO:     Uvicorn running on http://127.0.0.1:21003 (Press CTRL+C to quit)
|
2024-07-02 09:48:37 | INFO | model_worker | Send heart beat. Models: ['vicuna-7b-v1.5']. Semaphore: None. call_ct: 0. worker_id: 9de16774. |
|
2024-07-02 09:49:22 | INFO | model_worker | Send heart beat. Models: ['vicuna-7b-v1.5']. Semaphore: None. call_ct: 0. worker_id: 9de16774. |
|
2024-07-02 09:49:42 | INFO | stdout | INFO:     127.0.0.1:36282 - "POST /worker_get_status HTTP/1.1" 200 OK
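
The access-log entry above is the controller POSTing to the worker's /worker_get_status endpoint. A quick way to probe it by hand, assuming the worker from this log is still listening on 127.0.0.1:21003 and that the `requests` package is available (a sketch, not part of the original session):

```python
# Sketch: probe the status endpoint seen in the access log above.
# Assumes the worker is still up on 127.0.0.1:21003 (host/port from the args line).
import requests

resp = requests.post("http://127.0.0.1:21003/worker_get_status", timeout=10)
resp.raise_for_status()
print(resp.json())  # expected to report the served model names and queue length
```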
|
2024-07-02 09:50:07 | INFO | model_worker | Send heart beat. Models: ['vicuna-7b-v1.5']. Semaphore: None. call_ct: 0. worker_id: 9de16774. |
|
2024-07-02 09:50:46 | INFO | stdout | INFO:     127.0.0.1:45944 - "POST /worker_generate_stream HTTP/1.1" 200 OK
|
2024-07-02 09:50:52 | INFO | model_worker | Send heart beat. Models: ['vicuna-7b-v1.5']. Semaphore: Semaphore(value=5, locked=False). call_ct: 1. worker_id: 9de16774. |
|
2024-07-02 09:50:56 | INFO | stdout | INFO:     127.0.0.1:53530 - "POST /worker_generate_stream HTTP/1.1" 200 OK
|
2024-07-02 09:51:37 | INFO | model_worker | Send heart beat. Models: ['vicuna-7b-v1.5']. Semaphore: Semaphore(value=5, locked=False). call_ct: 2. worker_id: 9de16774. |
|
2024-07-02 09:52:22 | INFO | model_worker | Send heart beat. Models: ['vicuna-7b-v1.5']. Semaphore: Semaphore(value=5, locked=False). call_ct: 2. worker_id: 9de16774. |
|
2024-07-02 09:53:07 | INFO | model_worker | Send heart beat. Models: ['vicuna-7b-v1.5']. Semaphore: Semaphore(value=5, locked=False). call_ct: 2. worker_id: 9de16774. |
|
2024-07-02 09:53:52 | INFO | model_worker | Send heart beat. Models: ['vicuna-7b-v1.5']. Semaphore: Semaphore(value=5, locked=False). call_ct: 2. worker_id: 9de16774. |
|
2024-07-02 09:54:37 | INFO | model_worker | Send heart beat. Models: ['vicuna-7b-v1.5']. Semaphore: Semaphore(value=5, locked=False). call_ct: 2. worker_id: 9de16774. |
|
2024-07-02 09:54:50 | ERROR | stderr | INFO:     Shutting down

2024-07-02 09:54:51 | ERROR | stderr | INFO:     Waiting for application shutdown.

2024-07-02 09:54:51 | ERROR | stderr | INFO:     Application shutdown complete.

2024-07-02 09:54:51 | ERROR | stderr | INFO:     Finished server process [51055]
|
2024-07-02 09:54:51 | ERROR | stderr | Traceback (most recent call last): |
|
2024-07-02 09:54:51 | ERROR | stderr | File "/usr/lib/python3.8/runpy.py", line 194, in _run_module_as_main |
|
2024-07-02 09:54:51 | ERROR | stderr | return _run_code(code, main_globals, None, |
|
2024-07-02 09:54:51 | ERROR | stderr | File "/usr/lib/python3.8/runpy.py", line 87, in _run_code |
|
2024-07-02 09:54:51 | ERROR | stderr | exec(code, run_globals) |
|
2024-07-02 09:54:51 | ERROR | stderr | File "/LLM_32T/evelyn/FastChat/fastchat/serve/model_worker.py", line 425, in <module> |
|
2024-07-02 09:54:51 | ERROR | stderr | uvicorn.run(app, host=args.host, port=args.port, log_level="info") |
|
2024-07-02 09:54:51 | ERROR | stderr | File "/usr/local/lib/python3.8/dist-packages/uvicorn/main.py", line 577, in run |
|
2024-07-02 09:54:51 | ERROR | stderr | server.run() |
|
2024-07-02 09:54:51 | ERROR | stderr | File "/usr/local/lib/python3.8/dist-packages/uvicorn/server.py", line 65, in run |
|
2024-07-02 09:54:51 | ERROR | stderr | return asyncio.run(self.serve(sockets=sockets)) |
|
2024-07-02 09:54:51 | ERROR | stderr | File "/usr/lib/python3.8/asyncio/runners.py", line 44, in run |
|
2024-07-02 09:54:51 | ERROR | stderr | return loop.run_until_complete(main) |
|
2024-07-02 09:54:51 | ERROR | stderr | File "uvloop/loop.pyx", line 1511, in uvloop.loop.Loop.run_until_complete |
|
2024-07-02 09:54:51 | ERROR | stderr | File "uvloop/loop.pyx", line 1504, in uvloop.loop.Loop.run_until_complete |
|
2024-07-02 09:54:51 | ERROR | stderr | File "uvloop/loop.pyx", line 1377, in uvloop.loop.Loop.run_forever |
|
2024-07-02 09:54:51 | ERROR | stderr | File "uvloop/loop.pyx", line 555, in uvloop.loop.Loop._run |
|
2024-07-02 09:54:51 | ERROR | stderr | File "uvloop/loop.pyx", line 474, in uvloop.loop.Loop._on_idle |
|
2024-07-02 09:54:51 | ERROR | stderr | File "uvloop/cbhandles.pyx", line 83, in uvloop.loop.Handle._run |
|
2024-07-02 09:54:51 | ERROR | stderr | File "uvloop/cbhandles.pyx", line 63, in uvloop.loop.Handle._run |
|
2024-07-02 09:54:51 | ERROR | stderr | File "/usr/local/lib/python3.8/dist-packages/uvicorn/server.py", line 69, in serve |
|
2024-07-02 09:54:51 | ERROR | stderr | await self._serve(sockets) |
|
2024-07-02 09:54:51 | ERROR | stderr | File "/usr/lib/python3.8/contextlib.py", line 120, in __exit__ |
|
2024-07-02 09:54:51 | ERROR | stderr | next(self.gen) |
|
2024-07-02 09:54:51 | ERROR | stderr | File "/usr/local/lib/python3.8/dist-packages/uvicorn/server.py", line 328, in capture_signals |
|
2024-07-02 09:54:51 | ERROR | stderr | signal.raise_signal(captured_signal) |
|
2024-07-02 09:54:51 | ERROR | stderr | KeyboardInterrupt |
|
|