Search filter problems on site with big data feed

I’m a non-techie and new to these forums, so I hope I’m not out of place with how I have framed the question.

The background is that our website takes a single feed of over 800,000 products via an XML API, and users need to be able to filter the search results that come back.

Because we do not want to manage frequent name changes (which mess up the geo and product taxonomies associated with the landing pages for our SEO and PPC), we have opted for a system that fetches fresh data from the feed on each user request. We have hit a major snag.

The filtering is painfully slow; it cannot handle more than a couple of requests at a time.

Previously, on an earlier website, we stored all of the data on our side, cached it, and only called the feed for pricing. Filtering worked a treat. But we had major problems managing the data, and that was with just 60,000 geo-located products.
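Roughly, that earlier approach looked like the sketch below. This is a simplified illustration only; the endpoint URLs, XML tag names, and function names are placeholders I've made up, not our real system. The idea is just that filtering runs against a local copy refreshed on a schedule, and only the volatile bit (price) is fetched live.

```python
import xml.etree.ElementTree as ET
import requests

# Placeholder endpoints -- made up for illustration, not our real feed.
FEED_URL = "https://example-feed.invalid/products.xml"
PRICE_URL = "https://example-feed.invalid/price"

_catalogue = {}                 # product_id -> {"name": ..., "category": ..., "region": ...}
REFRESH_SECONDS = 6 * 60 * 60   # refresh the cached catalogue every few hours, not per request


def parse_feed(xml_bytes):
    """Turn the feed XML into a dict keyed by product id (tag names are hypothetical)."""
    catalogue = {}
    for node in ET.fromstring(xml_bytes).iter("product"):
        catalogue[node.get("id")] = {
            "name": node.findtext("name"),
            "category": node.findtext("category"),
            "region": node.findtext("region"),
        }
    return catalogue


def refresh_catalogue():
    """Pull the full feed and rebuild the local cache; run on a schedule."""
    global _catalogue
    response = requests.get(FEED_URL, timeout=120)
    response.raise_for_status()
    _catalogue = parse_feed(response.content)


def filter_products(category=None, region=None):
    """Filter against the local cache only -- no feed call, so this stays fast."""
    return [
        pid for pid, product in _catalogue.items()
        if (category is None or product["category"] == category)
        and (region is None or product["region"] == region)
    ]


def live_price(product_id):
    """Only the volatile data (price) is fetched from the feed per product shown."""
    response = requests.get(PRICE_URL, params={"id": product_id}, timeout=10)
    response.raise_for_status()
    return response.json()["price"]
```

That kept the filtering fast, but it also meant every name or taxonomy change in the feed had to be reconciled with our cached copy, which is where the management pain came from.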

Does anyone have experience with best practices for balancing the need to manage huge volumes of data changes against providing a great UI/filtering experience?
