Any experiences with a self-hosted assistant like the modern Google Assistant? Looking for something LLM-powered that is smarter than the older assistants, which would just try to call third-party tools directly and miss or misunderstand requests half the time.

I’d like integration with a mobile app to use it from the phone and while driving. I see Home Assistant has an Android Auto integration. Has anyone used this, or another similar option? Any blatant limitations?

  • hendrik

    LiveKit can be used to build voice assistants, but it's more a framework for building an agent yourself than a ready-made solution.
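
    For a feel of what that means in practice, here's a rough sketch of a LiveKit Agents worker based on the Python quickstart; the plugin choices and the local-model URL are my own assumptions for illustration, so check the LiveKit docs for exact names and arguments.

    ```python
    # Rough sketch of a LiveKit Agents worker (assumptions noted below).
    from livekit import agents
    from livekit.agents import Agent, AgentSession
    from livekit.plugins import openai, silero


    async def entrypoint(ctx: agents.JobContext):
        await ctx.connect()

        session = AgentSession(
            vad=silero.VAD.load(),   # voice activity detection
            stt=openai.STT(),        # swap for a self-hosted STT plugin if you want fully local
            # Assumption: the OpenAI plugin can be pointed at any OpenAI-compatible
            # local server (e.g. Ollama) instead of the hosted API.
            llm=openai.LLM(model="llama3.1", base_url="http://localhost:11434/v1"),
            tts=openai.TTS(),        # swap for a local TTS plugin
        )
        await session.start(
            room=ctx.room,
            agent=Agent(instructions="You are a self-hosted home voice assistant."),
        )


    if __name__ == "__main__":
        agents.cli.run_app(agents.WorkerOptions(entrypoint_fnc=entrypoint))
    ```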

  • @wildbus8979@sh.itjust.works

    Home Assistant can absolutely do that. If you're OK with simple, intent-based phrasing it'll do it out of the box. If you want complex understanding and reasoning you'll have to run a local LLM, like Llama, on top of it.
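
    If you want to poke at the built-in, intent-based pipeline without any LLM, a minimal sketch against Home Assistant's conversation REST endpoint looks like this; the URL and token are placeholders for your own instance.

    ```python
    import requests

    HA_URL = "http://homeassistant.local:8123"          # placeholder
    TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"               # placeholder

    resp = requests.post(
        f"{HA_URL}/api/conversation/process",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"text": "turn on the kitchen lights", "language": "en"},
        timeout=10,
    )
    resp.raise_for_status()
    # The response includes which intent matched and the assistant's spoken reply;
    # the reply typically lives under response -> speech -> plain.
    print(resp.json()["response"]["speech"]["plain"]["speech"])
    ```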

  • Jul (they/she)

    You have to run an LLM of your own and link it if you want quality even close to approaching Google, but Home Assistant with the Nabu Casa “Home Assistant Voice Preview Edition” speakers is working well enough for me. I don’t use it for much beyond controlling my home automation components, though. It’s still very early tech and it doesn’t understand all that much unless you add a lot of your own configuration. I eventually plan to add an LLM, but even just running on the Home Assistant Yellow hardware with a Raspberry Pi Compute Module 5 it works OK for the basics, though there is a slight delay.

    I haven’t tried it, but Nabu Casa also offers a subscription service for the voice processing if you want something more robust and can’t host your own LLM. That means sending your data out, though, even if they have good privacy policies, so I’m not interested: while I somewhat trust Nabu Casa’s current business model and policies, being hosted in the US means it’s susceptible to the current regime’s police-state policies. I’m waiting for hardware costs to recover from the AI bubble to self-host an LLM, personally.

  • @penguinA

    Home Assistant can do that; the quality will really depend on what hardware you have to run the LLM. If you only have a CPU you’ll be waiting 20 seconds for a response, and the response itself could be pretty poor if you have to run a small quantized model.
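
    If you want to gauge that latency on your own hardware first, something like this against a local Ollama server will tell you; the model tag is just an example of a small quantized build.

    ```python
    import requests

    # One-shot latency check against a locally hosted model (Ollama on its default port).
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "llama3.1:8b-instruct-q4_K_M",   # example quantized model tag
            "prompt": "Turn off the living room lights.",
            "stream": False,
        },
        timeout=120,
    )
    data = resp.json()
    # Ollama reports timings in nanoseconds.
    print(f"total: {data['total_duration'] / 1e9:.1f}s, "
          f"generation: {data['eval_duration'] / 1e9:.1f}s for {data['eval_count']} tokens")
    ```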

  • artyom

    Multi-billion-dollar companies like Google and Apple can’t even figure this shit out; doubt some nerd is gonna do it for free.

    • Eager EagleOP

      1. They can and do.
      2. LLMs can do tool calling just fine, even self-hosted ones (see the sketch below).
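
      A minimal sketch of the second point, assuming an OpenAI-compatible self-hosted server (Ollama, llama.cpp, vLLM, etc.); the URL, model name, and the light-switch tool are made up for illustration.

      ```python
      from openai import OpenAI

      # Point the client at a local OpenAI-compatible server instead of the hosted API.
      client = OpenAI(base_url="http://localhost:11434/v1", api_key="unused")

      # A made-up tool the model can call instead of answering in free text.
      tools = [{
          "type": "function",
          "function": {
              "name": "set_light",
              "description": "Turn a light on or off",
              "parameters": {
                  "type": "object",
                  "properties": {
                      "room": {"type": "string"},
                      "state": {"type": "string", "enum": ["on", "off"]},
                  },
                  "required": ["room", "state"],
              },
          },
      }]

      resp = client.chat.completions.create(
          model="llama3.1",
          messages=[{"role": "user", "content": "Turn on the kitchen lights"}],
          tools=tools,
      )

      # If the model decided to call the tool, the structured call is here;
      # hand it to your home automation layer.
      for call in resp.choices[0].message.tool_calls or []:
          print(call.function.name, call.function.arguments)
      ```
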
      • artyom

        LOL they can’t even reliably turn the lights on, WTF are you talking about?

        • Eager EagleOP

          maybe the last time you tried it was over 6 months ago, maybe you’re using the old Google Assistant, or idk, but it definitely works for me

          • artyom

            Everything I’ve read says Gemini is like 10x worse than Google Assistant.

  • James R Kirk

    Maybe things have improved, but the last time I tried the Home Assistant, er, assistant, it was garbage at anything other than the most basic commands, and only when they were given perfectly.