Dhanishtha-2.0-preview is the world's first model to use intermediate reasoning, meaning it reasons in the middle of its responses instead of front-loading a single chain of thought. This approach makes reasoning/CoT models more time- and token-efficient, lowering their cost.
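
As an illustration, here is a minimal sketch of querying the model through the standard `transformers` chat-template API. It assumes the preview is published on the Hugging Face Hub as `HelpingAI/Dhanishtha-2.0-preview` and that its intermediate reasoning is emitted in `<think>...</think>` blocks that can appear at several points inside a single reply; treat the repo id and output format as assumptions, not a definitive spec.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hub repo id for the preview release.
model_id = "HelpingAI/Dhanishtha-2.0-preview"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

messages = [{"role": "user", "content": "Which is larger, 9.11 or 9.9?"}]

# Build the prompt with the model's chat template and generate a reply.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
output_ids = model.generate(input_ids, max_new_tokens=512)

# With intermediate reasoning, the decoded text may interleave short
# <think>...</think> blocks with the visible answer, rather than emitting
# one long reasoning block before it.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:]))
```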