Quantum Computing

#2
by Reality123b - opened
DataMuncher Labs org

I have a quantum computer, if anyone is interested in accessing it and using it for some science/AI work (Roman specifically, since you helped me a lot)

DataMuncher Labs org

lol ty, I'm kinda new to making AIs, so I just get the data :> ok?

DataMuncher Labs org

Ok, I'll give you the link in ~6-7 days

DataMuncher Labs org

Also, if you have a quantum computer, what's the processor speed lol :>

DataMuncher Labs org
edited Jan 19

10^6 gates/s, 10k logical qubits (lol jk, it's not 10k qubits, try it out and you'll find out), 10^-13 error rate

DataMuncher Labs org

so... idk what that means... but uhm how does one even get that ;-;

DataMuncher Labs org

Sorry I couldn't respond yesterday; I'll respond later today. Also, I messed up the wording there, I couldn't think of the right words at the time, so it came across as unprofessional.
It's a "Quantum Processing Unit", not a "Quantum Computer".

DataMuncher Labs org

but still, a QPU is crazy... I'm still out here on free HF infra and cloud.

DataMuncher Labs org

yeah, my bad... but still, why no arXiv paper? like... publish one on the workings of it and use that as promo

  • use it to train llm models, upload to hf, then mention the hardware...
DataMuncher Labs org

"use it to train llm models, upload to hf, then mention the hardware..."

Wait a sec, that isn't a bad idea tbh. I could make LLM training way easier and more advanced, with the tradeoff of a learning-rate multiplier, then use the QPU to find the optimal learning rate, and then we'd have a really good model.

DataMuncher Labs org

yk what that would also give you? a research paper + free promo. optimal learning rate? everyone's gonna want that one! ngl I had an idea of doing

1 -> 0.5 -> 0.05 -> 0.005 -> 0.0005 -> 0.00005 -> 0.000005
(say the optimal weight is 3)
it does 1 epoch per value, and would therefore be a bit faster.
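The sweep described above can be sketched in a few lines. This is a minimal toy, not anything from the thread: the "epoch" is a fixed number of SGD steps on the quadratic loss f(w) = (w - 3)^2 (matching the "optimal weight is 3" example), and the step count and starting point are assumptions for illustration.

```python
# Toy sketch of the per-epoch learning-rate sweep described above.
# Assumptions (not from the thread): loss f(w) = (w - 3)^2, 20 steps
# per "epoch", starting weight w0 = 0.

def loss(w):
    return (w - 3.0) ** 2

def grad(w):
    return 2.0 * (w - 3.0)

def one_epoch(lr, w0=0.0, steps=20):
    """Run a fixed number of SGD steps from w0 and return the final loss."""
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)
    return loss(w)

# the candidate schedule from the message above
candidates = [1, 0.5, 0.05, 0.005, 0.0005, 0.00005, 0.000005]
results = {lr: one_epoch(lr) for lr in candidates}
best_lr = min(results, key=results.get)  # rate with the lowest final loss
```

On this toy problem the sweep picks 0.5 (lr = 1 overshoots and oscillates, the tiny rates barely move); with a real model you would compare validation loss after each trial epoch instead.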

DataMuncher Labs org

Nice! I'll look into the matter and tell you the results

DataMuncher Labs org

Maybe how learning rate changes scaling?
So how does lr affect scaling, or the optimizer, or weight init, or the seed?

DataMuncher Labs org

It affects the parameter update
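Concretely, the update being referred to is the plain gradient step, where the learning rate scales how far each update moves the weights. A minimal sketch (the list-of-floats representation is just for illustration):

```python
def sgd_step(w, grad_w, lr):
    """Plain SGD update: w <- w - lr * grad. The learning rate scales
    how far a single update moves each parameter."""
    return [wi - lr * gi for wi, gi in zip(w, grad_w)]

# the same gradient, moved half a step vs a quarter step
half = sgd_step([1.0], [0.5], 0.5)
quarter = sgd_step([1.0], [0.5], 0.25)
```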

DataMuncher Labs org

By that i meant...

should the same lr used on a 100M model be used on a 10B one?
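(For what it's worth, the usual answer is no; one common heuristic, not from this thread, is to shrink the base rate as the model widens, e.g. µP-style 1/width scaling. A toy sketch, where the rule, the base rate, and the widths are all assumptions for illustration:)

```python
def scaled_lr(base_lr, base_width, width):
    """Toy heuristic: transfer a learning rate tuned at base_width to a
    wider model by scaling it inversely with hidden width (muP-style).
    The specific rule and numbers here are illustrative assumptions."""
    return base_lr * base_width / width

# e.g. a rate tuned on a small model, transferred to an 8x wider one
small_lr = scaled_lr(2e-3, 1024, 1024)   # unchanged at the tuned width
big_lr = scaled_lr(2e-3, 1024, 8192)     # shrunk for the wider model
```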

DataMuncher Labs org

looks like I need a LOT of data, so help pls. and yes, you are correct
