Add flashinfer && Outlines (#1)
 3rdparty/flashinfer (vendored submodule) | 1
Submodule 3rdparty/flashinfer added at 00cf5f46fd
@@ -164,4 +164,4 @@ python -m sglang.launch_server --model-path meta-llama/Llama-2-7b-chat-hf --port
 }
 ```
 
-We learned from the design and reused some code of the following projects: [Guidance](https://github.com/guidance-ai/guidance), [vLLM](https://github.com/vllm-project/vllm), [LightLLM](https://github.com/ModelTC/lightllm), [FlashInfer](https://github.com/flashinfer-ai/flashinfer), [LMQL](https://github.com/eth-sri/lmql).
+We learned from the design and reused some code of the following projects: [Guidance](https://github.com/guidance-ai/guidance), [vLLM](https://github.com/vllm-project/vllm), [LightLLM](https://github.com/ModelTC/lightllm), [FlashInfer](https://github.com/flashinfer-ai/flashinfer), [Outlines](https://github.com/outlines-dev/outlines), [LMQL](https://github.com/eth-sri/lmql).