* Respect the CtxNum setting for Ollama Models
Currently, because the context window size isn't respected for Ollama models, the prompt is naively truncated, which removes important details. This leads to the model entering endless loops or making unsupported edits when operating as an agent (see the sketch below for how the setting is forwarded).
* small change
* changeset
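For illustration, a minimal sketch of the idea behind the fix, assuming the CtxNum setting is forwarded as Ollama's `num_ctx` option on `/api/chat`; the model name, endpoint, and field mapping here are placeholders, not the PR's actual implementation:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// chatRequest mirrors the subset of Ollama's /api/chat payload used here.
// Passing the configured context size through options.num_ctx is the
// assumed mapping of the CtxNum setting.
type chatRequest struct {
	Model    string              `json:"model"`
	Messages []map[string]string `json:"messages"`
	Options  map[string]any      `json:"options,omitempty"`
	Stream   bool                `json:"stream"`
}

func main() {
	ctxNum := 8192 // hypothetical CtxNum value taken from the model settings

	req := chatRequest{
		Model: "llama3",
		Messages: []map[string]string{
			{"role": "user", "content": "Summarize this repository."},
		},
		// Without num_ctx, Ollama falls back to its default context size
		// and silently truncates longer prompts.
		Options: map[string]any{"num_ctx": ctxNum},
		Stream:  false,
	}

	body, _ := json.Marshal(req)
	resp, err := http.Post("http://localhost:11434/api/chat", "application/json", bytes.NewReader(body))
	if err != nil {
		fmt.Println("request failed:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("status:", resp.Status)
}
```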
---------
Co-authored-by: 0xtoshii <94262432+0xToshii@users.noreply.github.com>