Maintenance of rxsiteitems table?

We’re running into some problems with blocking transactions on our repository during publishing. The culprit seems to be a query that is hitting rxsiteitems, and when I took a look at the table, I realized that it has 51,000+ rows in it.

Is there a recommended process for pruning this table to a manageable size? If I remove old entries, I may trigger some unneeded incremental publishing, but letting it grow indefinitely is clearly a bad idea.

Does anyone have any suggestions on this?

-Kathleen

We delete all related records from that table before every full publish. Not sure if this helps you.
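
If it helps, this is roughly the sort of pre-publish clean-up we run (a sketch only; the siteid column name and the 301 value are assumptions, so check them against your own schema and take a backup of the table first):

-- remove the last-published records for the site that is about to be fully published
-- 301 is a placeholder for your site's id; confirm the column name in your schema
DELETE FROM rxsiteitems
WHERE siteid = 301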

The site items table is designed to record the last published items. Its size is limited by the number of sites and the number of items in each site, so it should not grow indefinitely.

My experience with rxsiteitems came from problems with the publishing process hanging. Apparently, on each publish run an attempt is made to republish previously failed items. We now housekeep failed items and it hasn’t caused any problems.

If you run this query against the db:

select count(*) as num_items, pubstatus
from rxsiteitems
group by pubstatus

Do you have a significant number of failed items? Obviously, if you do go down the road of deleting failed items, make a backup of rxsiteitems first.
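
If you do have a large number of failures, the housekeeping itself is just a delete keyed on that status value. This is a sketch only; the backup syntax shown is SQL Server style and the 'F' value is a placeholder, so substitute whatever failure value the group-by query above actually reports:

-- take a copy of the table first (SQL Server syntax; adapt for Oracle etc.)
SELECT * INTO rxsiteitems_backup FROM rxsiteitems

-- then remove the failed rows
-- 'F' is a placeholder; use the failure value reported by the query above
DELETE FROM rxsiteitems WHERE pubstatus = 'F'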

Hi Kathleen

You can also run the following query, which will delete from rxsiteitems all rows for content items that are no longer in the system.

DELETE FROM rxsiteitems WHERE contentid NOT IN (SELECT contentid from CONTENTSTATUS)

Just check beforehand that it’s returning the correct result set by switching the DELETE to a SELECT, e.g.

SELECT * FROM rxsiteitems WHERE contentid NOT IN (SELECT contentid from CONTENTSTATUS)
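
If your database supports it, you can also run the delete inside a transaction and check the affected row count against the SELECT before committing (a sketch, assuming SQL Server style syntax):

BEGIN TRANSACTION
DELETE FROM rxsiteitems WHERE contentid NOT IN (SELECT contentid from CONTENTSTATUS)
-- if the reported row count matches the SELECT above, COMMIT; otherwise ROLLBACK
COMMIT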

Cheers
James