LocalAI/backend
Latest commit: 1dfc52de16 by dependabot[bot]
chore(deps): Bump intel-extension-for-pytorch from 2.3.110+xpu to 2.6.10+xpu in /backend/python/diffusers (#4973)

Bumps intel-extension-for-pytorch from 2.3.110+xpu to 2.6.10+xpu.

---
updated-dependencies:
- dependency-name: intel-extension-for-pytorch
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-10 21:14:43 +00:00
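
For anyone who wants to sanity-check the bump locally, the sketch below is a rough illustration and not part of the commit; it assumes the /backend/python/diffusers environment already has the new 2.6.10+xpu wheel installed alongside a matching PyTorch XPU build.

```python
# Rough verification sketch (assumption: not shipped with this commit).
# Confirms the bumped intel-extension-for-pytorch wheel imports cleanly and
# reports whether an Intel XPU device is visible to PyTorch.
import torch
import intel_extension_for_pytorch as ipex

print(ipex.__version__)          # should report 2.6.10+xpu after this bump
print(torch.xpu.is_available())  # True only when an Intel GPU driver/runtime is present
```

Running this inside the backend's virtual environment is the quickest check; a missing or mismatched GPU driver usually shows up as is_available() returning False rather than as an import error.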
cpp            fix(llama.cpp): correctly handle embeddings in batches (#4957)  2025-03-07 19:29:52 +01:00
go             chore(stable-diffusion-ggml): update, adapt upstream changes (#4889)  2025-02-23 08:36:41 +01:00
python         chore(deps): Bump intel-extension-for-pytorch from 2.3.110+xpu to 2.6.10+xpu in /backend/python/diffusers (#4973)  2025-03-10 21:14:43 +00:00
backend.proto  chore(deps): update llama.cpp and sync with upstream changes (#4950)  2025-03-06 00:40:58 +01:00