
Is the Google Sheets API rate limit open enough for actual production use?

I thought it was pretty restrictive (no more than 60 writes per minute), but I'm not sure about the read restrictions.



I used Google Sheets as a data source that business people could update, but eventually we moved away from it as we found it unreliable. We would get an occasional error (maybe a 429) even though we were polling the sheet once a minute (we had a few other sheets that polled once every few minutes).

This manifested during deploys, when we'd be unable to fetch critical data. We added retries and the like, but it seemed not great to run a business off of something that isn't designed for this purpose.
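For what it's worth, a minimal sketch of the kind of retry logic the parent describes. This assumes a generic call that throws on transient errors like 429s; `fetchSheet` and the error shape are illustrative stand-ins, not any real client library:

```javascript
// Retry a flaky async call with exponential backoff.
// fetchSheet (not shown) is a hypothetical Sheets read that may throw on 429.
async function withBackoff(fn, { retries = 5, baseMs = 500 } = {}) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= retries) throw err;    // give up after the last retry
      const delay = baseMs * 2 ** attempt;  // 500, 1000, 2000, ... ms
      await new Promise(r => setTimeout(r, delay));
    }
  }
}

// usage: const rows = await withBackoff(() => fetchSheet(spreadsheetId));
```

It papers over the occasional 429, but as the parent says, it doesn't fix the underlying reliability problem.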


Perhaps the dreaded 503 Internal Error?

I'm convinced most of the people in this thread haven't worked much with the Google Sheets API at scale. Most of the time it's fine, then it will have days where 30-40% of calls (as measured by Google Cloud console API monitoring) throw an internal error, for which Google's only advice is to "try again later". There are also API calls that take up to 4 minutes (?!) to return (again, as measured by their own API monitoring tools in the Cloud console).

It's too bad because I otherwise really like this approach.


Yes. I used Google Sheets as a database to build a website and ran into this issue. The worst part is, if you hit the limit there's not much you can do but wait or rate-limit yourself.

Another problem I had was an API change about a year in.

I would not use Google Sheets again. Maybe I’d try Airtable, Notion, or some other similar platform where the API access is more of a priority to the company.


For reading sheets, it's better to use the "share as CSV" option since that gets cached pretty well w/o limits
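If I understand the parent's approach, it relies on the standard export URL for a sheet that's shared "anyone with the link" — no API key or OAuth involved. A small sketch (the spreadsheet ID and `gid` here are placeholders, not a real document):

```javascript
// Build the CSV export URL for a link-shared Google Sheet.
// spreadsheetId and gid (the tab id) are placeholders for illustration.
function csvExportUrl(spreadsheetId, gid = 0) {
  return `https://docs.google.com/spreadsheets/d/${spreadsheetId}/export?format=csv&gid=${gid}`;
}

// usage: fetch(csvExportUrl("YOUR_SPREADSHEET_ID")) with any HTTP client,
// then parse the CSV body — the sheet must be shared for this to work.
```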


I've resisted the temptation to integrate with Google APIs for these two specific reasons: rate limits and API changes.


For now I'm setting no restrictions. Since it is an MVP, I'm trying to understand what a basic and a heavy user would look like. After a while, I'll figure out how to charge for it and what limitations a free and a paid user should have.

My Google API rate limit is much higher than 60/minute.


Couldn't you cache the reads? Not many usages really require real-time from their data store.


Do you really want to deal with caching logic for what should be a simple API call? Sounds like a convincing argument to use whatever this product is.


It's a couple of hours' work at most to cache in a local database like SQLite or in memory.


client = async (call) => { const key = sha1(call); return (await redis.get(key)) || api(call).then(res => { redis.set(key, res); return res; }); }

Not that hard. Like 10 lines of code to get a decentish cache going.


Assuming you have Redis


Redis uses like 5mb of baseline RAM and can be deployed in a few lines of docker-compose.


I'm not allowed to do that where I work. License is a no-no, can't run jobs without red tape, and there's no Docker either.


Ok? I'm surprised your work lets you build a whole product on top of Google Sheets, then. Also, why did you delete your original comment about not having a server?


I deleted it cause I realized this thing has a server (probably). Was mixing it up with other people's projects that didn't have one.

They're internal tools, but big ones. And I'm surprised too. You won't hit too much resistance doing things the well-supported ways, but for some reason there's no well-supported way to run a cache.


Hell, just stick the data in memory.
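Something like this, maybe — a minimal in-memory read cache with a TTL, assuming some `loadSheet` function that does the actual fetch (hypothetical name, not a real API):

```javascript
// Wrap an async loader so repeat calls within ttlMs reuse the last result.
// loadSheet (not shown) is a hypothetical function that reads the sheet.
function cached(loadFn, ttlMs = 60_000) {
  let value, fetchedAt = -Infinity;
  return async () => {
    if (Date.now() - fetchedAt > ttlMs) {
      value = await loadFn();   // refresh only when the cached copy is stale
      fetchedAt = Date.now();
    }
    return value;
  };
}

// usage: const getRows = cached(() => loadSheet(spreadsheetId), 60_000);
```

No Redis, no Docker, nothing to license — though note it doesn't coalesce concurrent requests, so two callers hitting a stale cache at once will each trigger a load.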


Valid strategy



