Benjamin Consolvo committed · Commit 5be51ca · 1 parent: ec12af2
README updates
README.md CHANGED
@@ -34,15 +34,15 @@ This application provides a user-friendly interface to interact with various LLM
 ### Setup
 
 1. Clone the repository:
-
-git clone https://
-cd
-
+```bash
+git clone https://huggingface.co/spaces/Intel/intel-ai-enterprise-inference
+cd intel-ai-enterprise-inference
+```
 
 2. Install the required packages:
-
+```bash
 pip install -r requirements.txt
-
+```
 
 ## Configuration
 
@@ -56,9 +56,9 @@ This application requires API credentials to be set up in Streamlit's secrets ma
 
 2. For local development:
    - Create a `.streamlit/secrets.toml` file with:
-
+```toml
 openai_apikey = "your-api-key-here"
-
+```
    - Set the `base_url` environment variable to point to your model endpoint with hosted models.
 
 ## Running the Application
@@ -72,7 +72,7 @@ https://huggingface.co/spaces/Intel/intel-ai-enterprise-inference.
 
 Run the Streamlit application locally:
 
-```
+```bash
 streamlit run app.py
 ```
 
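For context on the configuration lines added above, here is a minimal sketch of how the `openai_apikey` secret and the `base_url` environment variable named in the README are typically consumed. It assumes the app talks to the hosted models through the `openai` Python client against an OpenAI-compatible endpoint; the actual code in `app.py` may be structured differently, and the model name below is a placeholder.

```python
# Sketch only: assumes the openai client and Streamlit secrets; app.py may differ.
import os

import streamlit as st
from openai import OpenAI

# Key from .streamlit/secrets.toml: openai_apikey = "your-api-key-here"
api_key = st.secrets["openai_apikey"]

# Endpoint for the hosted models, read from the base_url environment variable
base_url = os.environ["base_url"]

client = OpenAI(api_key=api_key, base_url=base_url)

# Example request against a hosted model (model name is a placeholder)
response = client.chat.completions.create(
    model="your-model-name",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

Keeping the key in Streamlit secrets and the endpoint in an environment variable matches the split the README describes: the secret stays out of the repository, while the endpoint can be swapped per deployment without code changes.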
