Voice Logging Comments (iOS)

Apologies if this is not the right way to submit my semi-professional complaining; I'm new to the beta.

Here are some first impressions of the voice log. Firstly, I love this idea. I have meathooks for hands and don't use predictive text, so I rely almost exclusively on voice input or PC keyboards for entry. This is very much in my wheelhouse. I'm an appsec engineer and pentester by day, anyway.

  1. Voice logging seems to use Apple's context-unaware voice-to-text, or something similarly limited. If there were a way to connect to ChatGPT or Whisper (or whichever model ChatGPT uses for transcription), it is excellent at using context compared to Apple's.
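Short of swapping in Whisper, Apple's own Speech framework can at least be biased toward the vocabulary the log expects. A minimal sketch, assuming the feature is built on `SFSpeechRecognizer` (the example terms are placeholders, not the app's actual database):

```swift
import Speech

// Sketch: bias Apple's recognizer toward domain vocabulary so it is
// less likely to mis-hear entry names. `contextualStrings` is a real
// Speech framework property; the terms below are illustrative only.
let request = SFSpeechAudioBufferRecognitionRequest()
request.contextualStrings = ["quinoa", "kefir", "macchiato"]  // terms the log's database expects
request.shouldReportPartialResults = true
```

This wouldn't match a large language model's contextual accuracy, but it is a low-cost improvement within the existing API.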

  2. If voice logging hears the wrong word or picks a homophone, it doesn't seem to offer the Apple keyboard so I can correct the entry.

  3. Numerous times, the UI says it's listening and the button is highlighted, but it's not actually listening and nothing is being transcribed.

  4. It would be a little more convenient if the beta voice-logging button were not in the exact same place where the manual add used to be; putting it off to the side would avoid muscle-memory mis-taps.

  5. Obviously this is a beta, but a failed lookup presents an orange error screen at the bottom instead of a "no match" or "item not found" message.

  6. It would be good for workflow if the voice-matched query were sent automatically after a pause, so the user wouldn't have to tap the button a second time to stop speaking. Listening could instead switch off automatically, based either on a pause in speech or on the database returning results (or "no results found").
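The pause-to-send behavior above is cheap to prototype on top of Apple's Speech framework: each partial result resets a short silence timer, and when the timer fires, audio input ends and the final result is submitted. A minimal sketch, assuming the feature uses `SFSpeechRecognizer` (class and interval are illustrative, not the app's real API):

```swift
import Speech

// Hypothetical sketch: auto-stop after a pause in speech instead of
// requiring a second button tap. `VoiceLogController` and the 1.5 s
// threshold are assumptions for illustration.
final class VoiceLogController {
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private var silenceTimer: Timer?
    private var task: SFSpeechRecognitionTask?

    func start(with request: SFSpeechAudioBufferRecognitionRequest) {
        task = recognizer?.recognitionTask(with: request) { [weak self] result, error in
            guard let self else { return }
            // Every partial result resets the silence timer; if no new
            // speech arrives within ~1.5 s, stop listening and finish.
            self.silenceTimer?.invalidate()
            self.silenceTimer = Timer.scheduledTimer(withTimeInterval: 1.5, repeats: false) { _ in
                request.endAudio()    // stop capturing audio
                self.task?.finish()   // delivers the final result
                // …then run the matched database query automatically
            }
            if result?.isFinal == true {
                self.silenceTimer?.invalidate()
            }
        }
    }
}
```

A fixed threshold is the simplest version; the interval could also adapt to how fast the user speaks, or listening could end as soon as the database lookup resolves.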