dearwolf/LocalAI
Mirror of https://github.com/mudler/LocalAI.git (synced 2025-06-30 06:30:43 +00:00)
LocalAI / pkg / grpc at commit f98775ef5c
Latest commit 2931ea422d by mintyleaf (2024-11-28 02:25:07 +04:00): Use pb.Reply instead of []byte with Reply.GetMessage() in llama grpc to get the proper usage data in reply streaming mode at the last [DONE] frame
..
base/         feat(silero): add Silero-vad backend (#4204) · 2024-11-20 14:48:40 +01:00
backend.go    Use pb.Reply instead of []byte with Reply.GetMessage() in llama grpc to get the proper usage data in reply streaming mode at the last [DONE] frame · 2024-11-28 02:25:07 +04:00
client.go     Use pb.Reply instead of []byte with Reply.GetMessage() in llama grpc to get the proper usage data in reply streaming mode at the last [DONE] frame · 2024-11-28 02:25:07 +04:00
embed.go      Use pb.Reply instead of []byte with Reply.GetMessage() in llama grpc to get the proper usage data in reply streaming mode at the last [DONE] frame · 2024-11-28 02:25:07 +04:00
interface.go  feat(silero): add Silero-vad backend (#4204) · 2024-11-20 14:48:40 +01:00
server.go     Use pb.Reply instead of []byte with Reply.GetMessage() in llama grpc to get the proper usage data in reply streaming mode at the last [DONE] frame · 2024-11-28 02:25:07 +04:00
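The latest commit message describes a pattern worth spelling out: when streaming, passing the whole protobuf reply to the callback (rather than only the message bytes) lets the caller read the token-usage counters carried by the final [DONE] frame. The sketch below illustrates that idea in plain Go; the `Reply` struct and its field names are hypothetical stand-ins for the generated `pb.Reply` message, not LocalAI's actual types.

```go
package main

import "fmt"

// Reply is a hypothetical stand-in for the protobuf-generated pb.Reply
// message; field and getter names here are assumptions for illustration.
type Reply struct {
	Message      []byte
	PromptTokens int32
	Tokens       int32
}

// GetMessage mirrors the nil-safe getter style of protobuf-generated code.
func (r *Reply) GetMessage() []byte {
	if r == nil {
		return nil
	}
	return r.Message
}

// streamReplies simulates a streaming RPC. Because the callback receives
// the full *Reply (not just reply.GetMessage() as a []byte), the usage
// counters on the final frame are not lost.
func streamReplies(frames []*Reply, onReply func(*Reply)) {
	for _, f := range frames {
		onReply(f)
	}
}

func main() {
	frames := []*Reply{
		{Message: []byte("Hello")},
		{Message: []byte(", world")},
		// Final frame: no text, but it carries the token usage data
		// that a []byte-only callback would have dropped.
		{PromptTokens: 12, Tokens: 7},
	}

	var text string
	var prompt, completion int32
	streamReplies(frames, func(r *Reply) {
		text += string(r.GetMessage())
		if r.PromptTokens != 0 || r.Tokens != 0 {
			prompt, completion = r.PromptTokens, r.Tokens
		}
	})

	fmt.Printf("text=%q prompt_tokens=%d completion_tokens=%d\n", text, prompt, completion)
}
```

With a `[]byte`-only callback, the last frame would contribute an empty string and the usage counters would be unreachable; threading the full reply through is the minimal fix the commit message describes.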