google/gemma-3-27b-it running into token limit
#719 by auerovkh
When using google/gemma-3-27b-it, I've noticed I run into the error

Input validation error: `inputs` tokens + `max_new_tokens` must be <= 4096. Given: 4182 `inputs` tokens and 0 `max_new_tokens`

after 3-4 messages. Should/can this be increased?
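
For reference, the error means the prompt tokens plus the requested `max_new_tokens` together can't exceed 4096. A minimal sketch of checking how close a conversation is to that budget, assuming the `transformers` tokenizer for this model (the constants here are illustrative, not the Space's actual settings):

```python
from transformers import AutoTokenizer

MAX_TOTAL_TOKENS = 4096  # limit reported in the error message
MAX_NEW_TOKENS = 512     # hypothetical generation budget

tokenizer = AutoTokenizer.from_pretrained("google/gemma-3-27b-it")

def fits_budget(messages):
    """Return True if the chat prompt plus the generation budget fits under the limit."""
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )
    n_input = len(tokenizer(prompt)["input_ids"])
    return n_input + MAX_NEW_TOKENS <= MAX_TOTAL_TOKENS
```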
I've noticed this too. Is there no way to trim older data as the conversation grows to make room? None of the other models have this issue. A client-side workaround is sketched below.
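
One way to make room is to drop the oldest turns until the prompt fits again. A rough sketch, assuming the same `transformers` chat template as above (the function and parameter names are illustrative, not the Space's actual code):

```python
def trim_history(messages, tokenizer, max_total=4096, max_new=512):
    """Drop the oldest non-system messages until the prompt fits the token budget."""
    def n_prompt_tokens(msgs):
        prompt = tokenizer.apply_chat_template(
            msgs, tokenize=False, add_generation_prompt=True
        )
        return len(tokenizer(prompt)["input_ids"])

    trimmed = list(messages)
    while len(trimmed) > 1 and n_prompt_tokens(trimmed) + max_new > max_total:
        # Remove the earliest turn, but keep a leading system message if present.
        drop_at = 1 if trimmed[0].get("role") == "system" else 0
        trimmed.pop(drop_at)
    return trimmed
```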
hi

