floridianfisher's comments | Hacker News

The purpose is research


Benchmarks are on the model card https://huggingface.co/google/gemma-3-270m-it


According to the Gemma 3n preview blog, Gemma 3n shares the same architecture as the upcoming version of Gemini Nano.

The ‘n’ presumably stands for Nano.

Nano is a proprietary model that ships with Android. Gemma is an open model that can be adapted and used anywhere.

Sources: https://developers.googleblog.com/en/introducing-gemma-3n/

There's a video in the blog linked in this post.


Try one of the smaller versions. 27B is too big for your GPU.
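In case it helps, here's a minimal sketch of what "try a smaller version" could look like with Hugging Face transformers; the model id, prompt, and generation settings are illustrative assumptions, not something from the thread:

```python
# Minimal sketch (assumptions noted): load a smaller Gemma 3 checkpoint
# instead of the 27B one. "google/gemma-3-1b-it" is used here purely as an
# example of a variant that fits on a modest GPU.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="google/gemma-3-1b-it",  # far smaller than the 27B variant
    device_map="auto",             # places weights on the GPU if one is available
)

prompt = "In one sentence, why might a 27B-parameter model not fit on a consumer GPU?"
result = generator(prompt, max_new_tokens=64)
print(result[0]["generated_text"])
```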


I'm aware. I was addressing the question being asked.


The application of modern LLM technology to animal communication is exciting.


Why not try it here and make your comparisons that way? https://aistudio.google.com/app/prompts/new_chat?model=gemma...


One compelling reason not to would be a region block... [0]

[0] https://ai.google.dev/gemini-api/docs/available-regions




I'm not sure this link is very relevant unless you're trying to posit that popularity ~= quality in some way?

If you're interested in further testament to its popularity, builtwith.com is always a fun read. I believe about 20% of the internet runs WordPress, a figure that approaches 70% if you only count sites built with off-the-shelf CMS software.


You don't like the endless scroll?


No.


Sounds like it couldn't find an image. Working on a fix for that now.

