who was this guy?
EDIT: changed “whose” to “who was”


my names maria… doesnt show up in infinity and some other clients, i kno… evil clients… i changed my name months ago by now,


im happy for this person-
woa… what a… place-


heheheee yea… ;(
id much rather have a cutie toaster instead - but like, no ai, just a cutie face with like, frills and stuff.


se ;(
i wanna have my leotard cutie in my keyboard - right now!
u cant just say that so generally - tell us ur kink!
(if u feel comf with it, ive never been so forward on lemmy)
i only ever tapped my card - despite some times when it didnt work -
anyway - this is almost certainly a joke kinda post ~ ~


water u talkin about?.. i was genuinely askin whats up with this post - there r some symbols i feel one already needs to kno to understand it… u kno? or maybe some stereotypes im unaware of ~~


i dont understand anything…


next appointment is tomorrow. i will try my best.
thank u very much, dear hildegarde von Bingen —
i got some… chill pills from father now… will maybe use them. just tried one and it works reasonably well…
anyway, thank u for sharing that comment, its nice knowing that this is a reasonably common thing…
<3<3<3
aawawawwwwwwwawwwaqaa how do u do? ~ ~ ~ ~<3
awwawawarwarwarrrwawrwaa edna bricht aus >v< hey hi ednaaaaa <3<3
dogs really r popular, hm?..
AAAAAAAAAAAAAA (its monday, how did u kno?)


whose daylight is bein saved?..


mygosh - ive also been suspected of bein some LM multiple times… even in 196 chat…
bad grls dun get cream pie. only gud grls get pie 🥧 ☕ 🍫 ~ ~
sits and watches to see if ur gud
(very much hoping to see u being gud)
woaaaaaaaa deepcut chain of thought reference - no wayyyy–!!!
i - cannot - believe it ----- next up yall start talkin bout the lonely left-out mixture of experts expert whos never used for anything >o< woaaaaa thatd be crazy–
or - or - or - or - about how supersparse MoEs appear to do much better in both inference performance and training efficiency - just like the brain -
or or or orrrrr - or how like - laying out specific reasoning traces as text determines surprisingly accurately how any given model reaches a conclusion - again, very similar to how older humans have a harder time learning new stuff, as they appear to have more rigid reasoning patterns which tend to excel only in very narrow domains -
orrrrrror or or or or or maybe about how pretraining data appears to define a models reasoning ability and quality much more than the actual reasoning post-training does - similar to how some peeps have a much more difficult time acquiring knowledge than others do, as they may simply have specialized their mind into a different subset of abilities -
orrrrrrrrrrrrr how training a model with inherent knowledge and abilities is muuuuuuch more expensive than actually running it is - just like how evolving any species from the very beginning towards their current state took millions or trillions or i dunno how many failed attempts to now have some human which struggles with their new environments - (somethin we would call a distributional shift in model training, but noooooo thats a totally different thing cuz humans r totally different)
orrorror how like - the parameters-to-performance relation is not linear but logarithmic, implying that scaling up a small model returns much higher boosts to intelligence than scaling up an already-large model does - maybe implying that also in real life theres an optimal brain size at which diminishing returns yield no substantial gain over the previous smaller iteration??? (lil toy sketch of that down below)
thatd be crazy!! :ooooooooo
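(ok heres that lil toy sketch of the diminishing-returns thing - the toy_loss function and all its constants r made up by me, only the power-law-ish shape loosely echoes published scaling-law fits, so pls dont quote these numbers as any real model's curve:)

```python
# toy sketch: loss falls as a power law in parameter count,
# so every 10x more params buys a smaller absolute improvement.
# a and alpha are invented numbers, not fitted to any real model.
def toy_loss(n_params: float, a: float = 10.0, alpha: float = 0.076) -> float:
    return a / (n_params ** alpha)

prev = None
for n in [1e8, 1e9, 1e10, 1e11, 1e12]:
    loss = toy_loss(n)
    note = "" if prev is None else f" (only {prev - loss:.3f} better than the last 10x)"
    print(f"{n:.0e} params -> toy loss {loss:.3f}{note}")
    prev = loss
```

each 10x jump helps a lil less than the one before it - which is the whole "optimal brain size" vibe from above ~ ~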