To test:
from diffusers.modular_pipelines import ModularPipelineBlocks

# Load the custom prompt-expander block from the Hub. trust_remote_code is
# required because the block's implementation lives in the repository itself.
expander_qwen = ModularPipelineBlocks.from_pretrained(
    "diffusers-internal-dev/qwen-prompt-expander",
    trust_remote_code=True,
)

# Turn the block into a runnable pipeline, then expand a short prompt.
expander_qwen = expander_qwen.init_pipeline()
output = expander_qwen(prompt="a dog sitting by the river, watching the sunset")

# The expanded prompt is available in the returned pipeline state.
print(f"{output.values['prompt']=}")
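The expanded prompt can then be passed to a regular text-to-image pipeline. The snippet below is a minimal, untested sketch and is not part of this repository; the Qwen/Qwen-Image checkpoint, dtype, and device choices are assumptions about a typical downstream setup.

import torch
from diffusers import DiffusionPipeline

# Assumed downstream checkpoint; any text-to-image pipeline would work here.
pipe = DiffusionPipeline.from_pretrained(
    "Qwen/Qwen-Image", torch_dtype=torch.bfloat16
).to("cuda")

# Generate an image from the expanded prompt produced above.
image = pipe(prompt=output.values["prompt"]).images[0]
image.save("expanded_prompt.png")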