
In mysql or postgres, is there a limit to the size of an IN (1,2,n) statement?

https://www.devze.com 2022-12-22 20:15 Source: Web

I've got quite a few SQL statements like this:

SELECT foo FROM things WHERE user_id IN (1,2,3..n)

Is there a known limit to the number of elements that will safely fit in an IN clause like that?
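Whatever the limit turns out to be, it's worth building the IN list with one bound placeholder per value rather than string concatenation. A minimal sketch, using an in-memory SQLite database and a made-up schema (the question's real `things` table isn't shown):

```python
import sqlite3

# Hypothetical schema mirroring the question's "things" table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE things (user_id INTEGER, foo TEXT)")
conn.executemany("INSERT INTO things VALUES (?, ?)",
                 [(i, f"foo-{i}") for i in range(10)])

user_ids = [1, 2, 3]
# One "?" per value, so the driver handles quoting and escaping.
placeholders = ",".join("?" for _ in user_ids)
rows = conn.execute(
    f"SELECT foo FROM things WHERE user_id IN ({placeholders})",
    user_ids,
).fetchall()
print(rows)  # [('foo-1',), ('foo-2',), ('foo-3',)]
```

The same placeholder pattern works with MySQL and PostgreSQL drivers, though each has its own cap on the number of bound parameters per statement.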


The 1000 limit in PostgreSQL is not a hard limit; it is an optimization limit, i.e. beyond 1000 elements PostgreSQL doesn't handle it very well. Of course, I have to ask what in the world you are doing with a 1000-entry IN clause.


For MySQL, from the manual:

The number of values in the IN list is only limited by the max_allowed_packet value.
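So for MySQL the practical bound is the byte size of the whole statement, not the element count. A rough back-of-the-envelope sketch (the 64 MiB figure is the MySQL 8.0 default for `max_allowed_packet`; check `SHOW VARIABLES LIKE 'max_allowed_packet'` on your own server):

```python
# Assumed default; the real value is server-configurable.
MAX_ALLOWED_PACKET = 64 * 1024 * 1024  # 64 MiB

def in_clause_bytes(ids):
    """Byte size of the rendered '1,2,3,...' list inside IN (...)."""
    return len(",".join(str(i) for i in ids).encode())

ids = range(1, 100_001)  # 100k integer ids
size = in_clause_bytes(ids)
print(size, size < MAX_ALLOWED_PACKET)
```

Even 100k integer ids render to well under a megabyte, so with the default packet size the IN list itself is rarely the thing that breaks first.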


There is no technical limit, but there is a kind of 'good sense' limit.

Having too many elements in the IN clause probably means the query has a bad design (imho).
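One common redesign when the list gets large is to stage the ids in a (temporary) table and join against it, letting the database use an index instead of a giant literal list. A sketch with SQLite in-memory and invented table names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE things (user_id INTEGER, foo TEXT)")
conn.executemany("INSERT INTO things VALUES (?, ?)",
                 [(i, f"foo-{i}") for i in range(1000)])

# Instead of IN (1,2,...,n), load the wanted ids into a temp table
# and join; 'wanted' is a hypothetical name.
conn.execute("CREATE TEMP TABLE wanted (user_id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO wanted VALUES (?)",
                 [(i,) for i in range(0, 1000, 2)])  # every even id
rows = conn.execute(
    "SELECT t.foo FROM things t JOIN wanted w USING (user_id)"
).fetchall()
print(len(rows))  # 500
```

In MySQL or PostgreSQL the temp table can be bulk-loaded (LOAD DATA / COPY), which also sidesteps any per-statement size limit.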


I've used it in dynamic queries created for Postgres with SQLAlchemy, with over 25k parameters. They were also passed to Python functions via positional parameters (*args)... but I didn't notice a slowdown in my case. YMMV
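If you'd rather not push tens of thousands of parameters through one statement, a middle ground is to run the IN query in chunks. A sketch (SQLite in-memory, made-up schema; the chunk size of 500 is an assumption chosen to stay under common per-statement parameter limits):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE things (user_id INTEGER, foo TEXT)")
conn.executemany("INSERT INTO things VALUES (?, ?)",
                 [(i, f"foo-{i}") for i in range(10_000)])

def fetch_in_chunks(conn, ids, chunk_size=500):
    """Run the IN query in batches so no single statement gets huge."""
    results = []
    for start in range(0, len(ids), chunk_size):
        chunk = ids[start:start + chunk_size]
        placeholders = ",".join("?" for _ in chunk)
        results.extend(conn.execute(
            f"SELECT foo FROM things WHERE user_id IN ({placeholders})",
            chunk))
    return results

rows = fetch_in_chunks(conn, list(range(10_000)))
print(len(rows))  # 10000
```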


No, but be careful when using the IN statement. If you use a subquery in your IN statement, performance might be negatively impacted, because SQL Server must generate the entire result set and essentially build a potentially large IF statement internally.

For example, something like Select * From MyTable where MyColumn IN (Select myColumn from AnotherTable) might be somewhat slow if the subquery returns a large number of rows. Often it's more efficient to use EXISTS.
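The two forms return the same rows, so the rewrite is purely a performance choice. A small sketch of both, run against SQLite in-memory with invented data (the answer's `MyTable`/`AnotherTable` names are kept):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE MyTable (MyColumn INTEGER, val TEXT);
    CREATE TABLE AnotherTable (myColumn INTEGER);
    INSERT INTO MyTable VALUES (1,'a'),(2,'b'),(3,'c');
    INSERT INTO AnotherTable VALUES (2),(3),(4);
""")

# IN with a subquery...
in_rows = conn.execute(
    "SELECT * FROM MyTable WHERE MyColumn IN "
    "(SELECT myColumn FROM AnotherTable)").fetchall()

# ...versus the equivalent correlated EXISTS.
exists_rows = conn.execute(
    "SELECT * FROM MyTable m WHERE EXISTS "
    "(SELECT 1 FROM AnotherTable a WHERE a.myColumn = m.MyColumn)"
).fetchall()
print(in_rows == exists_rows)  # True
```

(Note one semantic difference worth knowing: with NULLs involved, `NOT IN` and `NOT EXISTS` behave differently.)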


I'm pretty sure Postgres has a limit of 1000... can't find any documentation to support that, though.

