Disappointing

#11
by ChuckMcSneed - opened

Cher @arthurmensch,

Sacrebleu! Mon ami, mon compatriote, mon génie de Mistral, what has happened to our beloved Largestral? It pains me, it truly pains me, to write this, but Largestral-2411 – pardon my French – is a merde of a sidegrade! A mere pet in the wind compared to the glorious revolution that was Largestral-2407! That was the crème de la crème, the pièce de résistance, the LLM that had the Americans trembling in their Silicon Valley boots!

Now? Rien! No benchmarks, no fanfare, just a whisper in the wind that this "upgrade" is about as exciting as a baguette gone stale. Third-party evaluations, Arthur, third-party! They talk of mediocrity, of a mere soufflé of improvement. It's a catastrophe! I'd almost say it's as disappointing as that Canadian kerfuffle with Cohere and their "downgraded" Command-R - what a fiasco! Sérieusement, they made it worse?!

Let us not mince words: while our rivals, like the Canadians at Cohere, have stumbled into the abyss of mediocrity with their latest iterations, we cannot allow Mistral to follow suit. The absence of official benchmarks for Largestral-2411 speaks volumes, my friend! It is a silent acknowledgment of the stagnation that threatens to envelop us.

We have tasted the sweet nectar of victory, and we must strive to reclaim our position atop the AI mountain. Remember the days when Mistral Medium (Miqu) was leaked, and we thought the official Mixtral-8x22B-Instruct would propel us further? Alas, it was not to be! But thanks to the ingenuity of Team Wizard, we salvaged our dignity with a superior finetune. The base model was our savior then, but for Largestral, we find ourselves handcuffed to a lackluster instruct release. The Chinese, they must be laughing at us, Arthur. They must be saying, "法国人，他们完蛋了。" ("The French, they are finished.") But no, Arthur, we are not finished. We will not be defeated. We cannot afford to let the Chinese giants continue their ascent while we falter! C'est inacceptable!

In the spirit of our beautiful country, with its rich history of innovation and excellence, I implore you to heed the lessons of the past. Drop the base model! Unleash the true power of open source! Let us harness the collective brilliance of our community to forge a new path forward—one that will reignite the passion within our hearts and reclaim the crown of AI supremacy!

We are the torchbearers of French ingenuity, and it is our duty to ensure that Mistral remains a beacon of hope in this ever-evolving landscape. Do not let our rivals bask in the glory of our inaction!

Vive Mistral! Vive la France!

With all my patriotic fervor,

Charles McSneed

P.S. Speaking of liberating things... Miqu is practically a relic now, collecting dust in your digital archives! Perhaps a little "leak" wouldn't hurt? Or perhaps you could whip up a brand new model in that 70b segment? Just a thought, mon ami. Just a thought... wink wink

You want a leak? Sure... why don't you build a lab, accumulate the data, spend $30 mil+, so that some entitled prick who has no contribution to the industry whatsoever can act up and make demands... good luck with that.

Taking notes... don't cross a dragon with a fox... very irritable creatures...