Will a MySQL trigger be faster than PHP at checking 20+ rows?
At the moment we record a users last 20 media views so that they can have a brief history page of what they've looked at recently.
To do this we insert the media ids into a table.
To keep the table small it's currently capped at 20 items per user, so before inserting the row we check how many history ids they currently have. If it's fewer than 20 we just insert the new row; if it's 20 or more we have to select the oldest row and delete it before the new row can be inserted.
$historyResult = mysql_query("SELECT id FROM mediatable WHERE userId = $userId ORDER BY id ASC");
if(mysql_num_rows($historyResult) >= 20)
{
$historyResult = mysql_query("SELECT id FROM mediatable WHERE userId = $userId ORDER BY id ASC LIMIT 1");
$historyRow = mysql_fetch_row($historyResult);
mysql_query("DELETE FROM mediatable WHERE id = '$historyRow[0]'");
}
mysql_query("INSERT INTO mediatable (userId, mediaId) VALUES ($userId, $mediaId)");
I'm now converting the site to more modern code and will be using PDO queries. My question is:
Is the above a good way to approach this or should I use another method such as a MySQL trigger on insert?
If you are really interested in maximum speed, there's also the extremely performant option of not checking how many rows are there in the first place and having a cleanup operation prune them in the background. Whenever you want to insert a new row, simply do so directly. Whenever you fetch the most recent rows to display, use LIMIT 20.
A garbage collector process can run independently of this as often as you want it to, looking for users with more than 20 history records and deleting the oldest appropriately. This way you have a guaranteed lower bound for recent history items (provided they have been created at some point) and an unlimited but quite low in practice amount of old history records awaiting pruning. You can decide on the collection frequency as you prefer.
On the other hand, if it's not absolute performance that you are after then a check from PHP should be fine -- provided that you change it to do SELECT COUNT(*) FROM mediatable WHERE userId = $userId when checking how many ids already exist. This also makes it far easier to modify your retention strategy down the road.
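A minimal PDO sketch of that check-then-insert approach (assuming a connected PDO instance $pdo and the table and columns from the question; MySQL's DELETE ... ORDER BY ... LIMIT avoids the separate SELECT for the oldest row):

```php
<?php
// Assumes $pdo is a connected PDO instance and $userId / $mediaId are integers.
$count = $pdo->prepare("SELECT COUNT(*) FROM mediatable WHERE userId = ?");
$count->execute([$userId]);

if ($count->fetchColumn() >= 20) {
    // Delete the single oldest history row for this user.
    $del = $pdo->prepare(
        "DELETE FROM mediatable WHERE userId = ? ORDER BY id ASC LIMIT 1"
    );
    $del->execute([$userId]);
}

$ins = $pdo->prepare("INSERT INTO mediatable (userId, mediaId) VALUES (?, ?)");
$ins->execute([$userId, $mediaId]);
```

Prepared statements with bound parameters also remove the SQL-injection risk the original string-interpolated queries carry.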
Update: Garbage collection strategies to consider
- Scheduled GC: Collection happens periodically, most likely on a fixed schedule. Typically this is implemented as a cron job or scheduled task, depending on OS. Advantages include reliability and the option to use historical data to significantly decrease the standard deviation of items waiting to be pruned. Additionally, cleanup is not associated with (and should not block) any one user request; this eliminates the (typically rare) occasion where a user has to wait for GC to be completed before their content becomes available.
- Probabilistic GC: Collection is triggered randomly. Your application specifies a set of actions that can trigger GC (e.g. creating a new history item) and a probability for GC to be triggered whenever any action in the set occurs; collection can happen as part of processing the action, or the collector can be a separate process that is launched on demand. The most significant advantage is that no setup external to your application is required, which may not be as important if your app is not meant to be widely distributed. Adjusting the GC probability while taking into account the mean number of triggering actions per unit of time allows you to target (albeit coarsely) a desired interval between successive collections.
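The probabilistic variant could be sketched like this (assumptions: a connected PDO instance $pdo and a 1-in-50 trigger chance, both illustrative; the derived-table wrapper is needed because MySQL will not delete from a table referenced directly in a subquery):

```php
<?php
// Roughly 1-in-50 chance to run garbage collection after recording a view.
if (mt_rand(1, 50) === 1) {
    // Keep only the newest 20 rows for this user. The inner "keep" derived
    // table works around MySQL's restriction on deleting from a table that
    // the same statement also selects from.
    $gc = $pdo->prepare(
        "DELETE FROM mediatable
         WHERE userId = ? AND id NOT IN (
             SELECT id FROM (
                 SELECT id FROM mediatable
                 WHERE userId = ? ORDER BY id DESC LIMIT 20
             ) keep
         )"
    );
    $gc->execute([$userId, $userId]);
}
```

Raising or lowering the 1-in-50 odds trades cleanup latency against how much stale history accumulates between collections.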
The best you can do is make one stored procedure and just pass it your new data and the maximum length. Pass the maximum length as a parameter because it's 20 now but may need to change later, so keeping it dynamic is better. Triggers are also good, but precompiled stored procedures are better in terms of speed and customization.
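A hypothetical stored procedure along those lines (the name add_history and the p_maxLen parameter are illustrative, not from the original answer; the logic mirrors the question's count-then-delete approach):

```sql
DELIMITER //
CREATE PROCEDURE add_history(IN p_userId INT, IN p_mediaId INT, IN p_maxLen INT)
BEGIN
    DECLARE cnt INT;
    SELECT COUNT(*) INTO cnt FROM mediatable WHERE userId = p_userId;
    -- If the user is at the cap, drop the oldest row to make room.
    IF cnt >= p_maxLen THEN
        DELETE FROM mediatable
        WHERE userId = p_userId
        ORDER BY id ASC
        LIMIT 1;
    END IF;
    INSERT INTO mediatable (userId, mediaId) VALUES (p_userId, p_mediaId);
END //
DELIMITER ;
```

From the application it would then be a single call, e.g. CALL add_history(42, 123, 20), so the count/delete/insert round trips collapse into one statement.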
The following will insert a new row, or reuse an existing id when the user already has 20 rows. The query picks the oldest existing id for that user, and the ON DUPLICATE KEY UPDATE clause overwrites that row's mediaId (a plain INSERT would otherwise fail with a duplicate-key error on the reused id). Note that because the oldest id is reused, ordering by id alone no longer reflects viewing order; a timestamp column would be needed for an accurate history.
mysql_query("
    INSERT INTO mediatable (id, userId, mediaId)
    SELECT m.idd, d.u, d.m
    FROM
        (SELECT " . $userId . " u, " . $mediaId . " m FROM dual) d
        LEFT JOIN (
            SELECT userId u, MIN(id) idd
            FROM mediatable
            WHERE userId = " . $userId . "
            GROUP BY userId
            HAVING COUNT(*) >= 20
        ) m ON d.u = m.u
    ON DUPLICATE KEY UPDATE mediaId = VALUES(mediaId)
");