How to increase query speed without using full-text search?


This is my simple query; by searching for "selectnothing" I'm sure I'll have no hits.

SELECT nome_t FROM myTable WHERE nome_t ILIKE '%selectnothing%';

This is the EXPLAIN ANALYZE VERBOSE output:

Seq Scan on myTable  (cost=0.00..15259.04 rows=37 width=29) (actual time=2153.061..2153.061 rows=0 loops=1)
  Output: nome_t
  Filter: (nome_t ~~* '%selectnothing%'::text)
Total runtime: 2153.116 ms

myTable has around 350k rows and the table definition is something like:

CREATE TABLE myTable (
    nome_t text NOT NULL
);

I have an index on nome_t as stated below:

CREATE INDEX idx_m_nome_t ON myTable
USING btree (nome_t);

Although this is clearly a good candidate for full-text search, I would like to rule that option out for now.

This query is meant to be run from a web application, and it currently takes around 2 seconds, which is obviously too much.

Is there anything I can do, like using other index methods, to improve the speed of this query?


No, ILIKE '%selectnothing%' always needs a full table scan; with a leading wildcard, the B-tree index is useless here. You need full-text search, and it's not that hard to implement.
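
For reference, a minimal full-text-search sketch might look like the following; the nome_tsv column and idx_m_nome_tsv index names are mine, and note that full-text search matches whole lexemes rather than arbitrary substrings, so it is not an exact drop-in replacement for '%selectnothing%':

-- Add a tsvector column, populate it, and index it with GIN.
ALTER TABLE myTable ADD COLUMN nome_tsv tsvector;
UPDATE myTable SET nome_tsv = to_tsvector('simple', nome_t);
CREATE INDEX idx_m_nome_tsv ON myTable USING gin (nome_tsv);

-- The query then becomes an indexed @@ match instead of a sequential ILIKE scan.
SELECT nome_t FROM myTable WHERE nome_tsv @@ plainto_tsquery('simple', 'selectnothing');

You would also need a trigger (for example one built on tsvector_update_trigger) to keep nome_tsv current on INSERT/UPDATE.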


Edit: You could use Wildspeed, which I forgot about. The indexes will be huge, but your performance will also be much better.

The Wildspeed extension provides GIN index support for wildcard searches with the LIKE operator.

http://www.sai.msu.su/~megera/wiki/wildspeed
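
As a rough sketch based on that page, assuming its wildcard_ops operator class (the name may differ between Wildspeed versions):

-- GIN index over the text column using Wildspeed's operator class.
CREATE INDEX idx_m_nome_t_wild ON myTable USING gin (nome_t wildcard_ops);

-- Wildspeed targets LIKE-style wildcard searches such as:
SELECT nome_t FROM myTable WHERE nome_t LIKE '%selectnothing%';

As noted above, expect this index to be much larger than the existing B-tree index.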


Another thing you can do is break the nome_t column out of myTable into its own table. Scanning a single column of a wide table is slow (say there are fifty other wide columns) because the other data effectively slows down the scan against that column: fewer records fit per page/extent.
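
A hedged sketch of that idea, assuming a hypothetical primary key column id on myTable to link the rows back:

-- Narrow side table holding only the searched column.
CREATE TABLE myTable_names (
    id     integer PRIMARY KEY REFERENCES myTable (id),
    nome_t text NOT NULL
);

INSERT INTO myTable_names (id, nome_t)
SELECT id, nome_t FROM myTable;

-- The sequential scan now reads far fewer pages than scanning the wide table.
SELECT nome_t FROM myTable_names WHERE nome_t ILIKE '%selectnothing%';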
