This might also be an automatic response to shut down discussion, although I'm not sure, since it's MS's AI.

  • plz1
    1 year ago

    I get Copilot to bail on conversations so often, like in your example, that I'm only using it for help with programming/code snippets at this point. The moment you question its accuracy, bam, chat's over.

    I asked if there was a Copilot extension for VS Code, and it said yup, talked me through how to install it, and even how to configure it. That was completely fabricated, and as soon as I asked for more detail to prove it was real, chat's over.

    • That would force them to reveal their sources (unconsented scraping), which would make them liable for potential lawsuits. So they have to avoid revealing sources.