Encoding Error: 'ascii' codec can't encode character '\xe0' in position 41: ordinal not in range(128) #576

Open
SamuelDevdas opened this issue Sep 18, 2024 · 0 comments


Description

When attempting to use the llm command-line interface (CLI) on Windows, I encounter an encoding error related to the ASCII codec. Despite setting environment variables to enforce UTF-8 encoding, the issue persists. Additionally, configuring models results in errors indicating that certain models are unknown or not recognized.

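A quick way to confirm whether the UTF-8 settings described below actually reach the interpreter that runs llm (a minimal sketch using only the standard library; run it inside the same Poetry shell):

    # check_encoding.py -- run with: python check_encoding.py
    import locale
    import sys

    print("stdout encoding     :", sys.stdout.encoding)            # expected 'utf-8' when PYTHONIOENCODING applies
    print("preferred encoding  :", locale.getpreferredencoding(False))
    print("UTF-8 mode          :", sys.flags.utf8_mode)            # 1 when PYTHONUTF8=1 (or -X utf8) is honoured
    print("filesystem encoding :", sys.getfilesystemencoding())

If any of these still report ascii or a cp* code page, the variables were most likely not visible to the process that actually runs llm.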

Steps to Reproduce

  1. Set Up the Environment:

    • Operating System: Windows 10
    • PowerShell Version: 5.1.22621.4111 (Windows PowerShell ConsoleHost; the full $Host output is in the Environment section below)
    • Python Version: 3.11
    • Poetry Version: 1.8.2
    • llm Version: 0.16
  2. Configure Environment Variables:

    $Env:PYTHONIOENCODING = "utf-8"
    $Env:PYTHONUTF8 = "1"
  3. Activate Poetry Virtual Environment:

    • Navigate to a project directory (ensure it's not within site-packages):

      mkdir C:\Users\samue\llm-project
      cd C:\Users\samue\llm-project
      poetry init --no-interaction
      poetry add llm
      poetry shell
  4. Run the llm Command with ASCII Input (a Python-level equivalent is sketched after this list):

    llm 'hi'

    Expected Behavior: The command should execute without encoding errors and return a response from the language model.

    Actual Behavior:

    Error: 'ascii' codec can't encode character '\xe0' in position 41: ordinal not in range(128)
    
  5. Run the llm Command with Non-ASCII Input:

    llm 'Café'

    Actual Behavior:

    Error: 'ascii' codec can't encode character '\xe0' in position 41: ordinal not in range(128)
    
  6. Attempt to Add and Set a Model:

    llm models add groq-llama3.1-70b --path "C:\Path\To\groq-llama3.1-70b"
    llm models set default groq-llama3.1-70b

    Actual Behavior:

    Error: Unknown model: groq-llama3.1-70b
    
  7. List Available Models:

    llm models list

    Actual Output:

    Available models:
    - groq-llama3.1-70b
    
  8. Attempt to Use the Default Model:

    llm 'hi'

    Actual Behavior:

    Error: 'groq-llama3.1-70b' is not a known model
    
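Note that the prompt in step 4 is pure ASCII, yet the error complains about '\xe0' ('à') at position 41, so the string being ASCII-encoded presumably comes from somewhere other than the prompt itself (for example a path or configuration value). The sketch below is a rough Python-level equivalent of step 4; it bypasses the CLI's console output layer and prints a full traceback, which should show which string is being encoded. The model name is only an example and assumes an OpenAI key has been configured with llm keys set openai:

    # repro_api.py -- run inside the same Poetry shell
    import traceback

    import llm

    try:
        model = llm.get_model("gpt-3.5-turbo")   # example model; substitute whichever model the CLI is using
        response = model.prompt("hi")
        print(response.text())
    except UnicodeEncodeError:
        traceback.print_exc()                    # the traceback shows where the ASCII encode happens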

Expected Behavior

  • The llm command should handle both ASCII and non-ASCII inputs without encountering encoding errors.
  • Adding and setting models should recognize and configure valid models without errors.

Actual Behavior

  • Encountering an encoding error with both ASCII ('hi') and non-ASCII ('Café') prompts, specifically:

    Error: 'ascii' codec can't encode character '\xe0' in position 41: ordinal not in range(128)
    
  • Errors related to unrecognized models when attempting to add or set a default model:

    Error: 'groq-llama3.1-70b' is not a known model
    

Environment

  • Operating System: Windows 10
  • PowerShell Version:
    Name : ConsoleHost
    Version : 5.1.22621.4111
    InstanceId : e0d61ad7-8a35-4d6f-9199-74f95ce5a7e9
    UI : System.Management.Automation.Internal.Host.InternalHostUserInterface
    CurrentCulture : en-US
    CurrentUICulture : en-GB
    PrivateData : Microsoft.PowerShell.ConsoleHost+ConsoleColorProxy
    DebuggerEnabled : True
    IsRunspacePushed : False
    Runspace : System.Management.Automation.Runspaces.LocalRunspace
  • Python Version: 3.11
  • Poetry Version: 1.8.2
  • llm Version: 0.16
  • Installation Path: C:\Users\samue\AppData\Roaming\pypoetry\venv\Lib\site-packages\llm\
  • Virtual Environment Path: C:\Users\samue\AppData\Roaming\pypoetry\venv\Lib\site-packages\llm\llm-test\.venv

Environment Variables

$Env:PYTHONIOENCODING = "utf-8"
$Env:PYTHONUTF8 = "1"
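
As a sanity check (a sketch, assuming the same Poetry shell), the two variables can be confirmed from inside the interpreter that Poetry actually launches; both only take effect if they are set before Python starts:

    # env_check.py -- run with: python env_check.py
    import os
    import sys

    print("PYTHONIOENCODING =", os.environ.get("PYTHONIOENCODING"))
    print("PYTHONUTF8       =", os.environ.get("PYTHONUTF8"))
    print("interpreter      =", sys.executable)   # confirms which venv's python is in use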

Additional Context

  • Attempted Solutions:

    • Set PowerShell's output encoding to UTF-8.
    • Configured Python environment variables (PYTHONIOENCODING and PYTHONUTF8) to enforce UTF-8 encoding.
    • Activated the Poetry virtual environment within a separate project directory.
    • Tried adding and setting different models, including a local model (groq-llama3.1-70b), which resulted in errors (a Python-level lookup is sketched after these notes).
  • Error Messages:

    Error: 'ascii' codec can't encode character '\xe0' in position 41: ordinal not in range(128)
    
    Error: 'groq-llama3.1-70b' is not a known model
    
  • Commands Run:

    llm 'hi'
    llm 'Café'
    llm models add groq-llama3.1-70b --path "C:\Path\To\groq-llama3.1-70b"
    llm models set default groq-llama3.1-70b
    llm models list
    llm 'hi'
  • Attempts to Use llm-cmd:

    llm-cmd hi

    Result:

    llm-cmd : The term 'llm-cmd' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
    
  • Notes:

    • Attempting to run poetry shell within the site-packages\llm directory results in an error due to the absence of a pyproject.toml file.
    • The issue persists even after setting environment variables within the Poetry shell.
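
For the "not a known model" half of this report, the same lookup can be tried from Python. This is a sketch that assumes llm's documented Python API, where llm.get_model() raises llm.UnknownModelError for names that no installed plugin or alias provides; the model name is the one from the report:

    # model_check.py -- run inside the same Poetry shell
    import llm

    name = "groq-llama3.1-70b"
    try:
        model = llm.get_model(name)
        print("resolved:", model.model_id)
    except llm.UnknownModelError:
        # the name is not registered in this environment; the plugin or alias
        # that is supposed to provide it may be missing from this venv
        print(f"{name!r} is not a known model in this environment")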