Ruby: SQLite3::BusyException: database is locked

Problem description:

Ran into this error message whilst developing tonight: SQLite3::BusyException: database is locked:

I have two models:

  • A Podcast has many Tracks
  • A Track belongs to a Podcast
  • The podcast files are hosted on Mixcloud

To create a Podcast:

  • the user submits the URL of a podcast on Mixcloud
  • the Rails app fetches the JSON feed associated with that URL
  • the JSON is used to set attributes (title, image, etc.) on the new Podcast object

I'm trying to get my rails app to take advantage of the fact that the json feed also details the names (and artists) of the Tracks that belong to this Podcast.

I thought the following before_validation method would automatically create all associated Tracks whenever we create a new Podcast.

class Podcast < ActiveRecord::Base
  attr_accessible :mixcloud_url, :lots, :of, :other, :attrs
  has_many :tracks
  before_validation :create_tracks

  def create_tracks
    json = Hashie::Mash.new HTTParty.get(self.json_url)
    json.sections.each do |section|
      if section.section_type == "track"
        Track.create(:name => section.track.name, :podcast_id => self.id)
      end
    end
  end
end

How can I get round this? It looks like rails (or sqlite3) doesn't like me creating new instances of an associated model in this way. How else can I do this? I suspect this is as much a rails problem as an sqlite3 one. I can post more code if it's gonna help.
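One workaround worth trying (a sketch, not from the original post): instead of calling `Track.create` inside `before_validation`, where `self.id` is still `nil` and every call issues a separate write against SQLite, collect the track attributes from the feed and hand them to `tracks.build`, so the tracks are inserted in the same transaction as the podcast itself. The snippet below shows the attribute-collection step in plain Ruby; the feed shape and field names are assumed from the code in the question and may not match the real Mixcloud feed:

```ruby
require "json"

# Hypothetical sample of the feed, with field names taken from the
# question's code (section_type, track.name); the real feed may differ.
feed = JSON.parse(<<~JSON)
  {"sections": [
    {"section_type": "chapter", "chapter": {"name": "Intro"}},
    {"section_type": "track", "track": {"name": "Song A"}},
    {"section_type": "track", "track": {"name": "Song B"}}
  ]}
JSON

# Collect an attribute hash for every "track" section. In the model these
# would be passed to `tracks.build(attrs)` inside before_validation, so the
# tracks are saved together with the podcast: one transaction, no second
# concurrent write to SQLite, and no need for self.id to exist yet.
track_attrs = feed["sections"]
  .select { |s| s["section_type"] == "track" }
  .map { |s| { :name => s["track"]["name"] } }

puts track_attrs.map { |a| a[:name] }.join(", ")  # → Song A, Song B
```

In the Podcast model the loop body would then become something like `tracks.build(:name => section.track.name)` in place of `Track.create(...)`; `tracks.build` is standard ActiveRecord, the rest is the question's own feed-parsing code.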

For anyone else encountering this issue with SQLite locking in development when a Rails console is open, try this:

Just run the following command:

ActiveRecord::Base.connection.execute("BEGIN TRANSACTION; END;")

For me anyway, it appears to clear any transaction that the console was holding onto and frees up the database.

This is especially a problem for me when running delayed_job, which seems to fail at closing the transaction quite often.
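Another knob worth checking (not part of the original answer): the Rails sqlite3 adapter accepts a `timeout` option in config/database.yml, which sets SQLite's busy timeout in milliseconds, so a locked database is retried for that long before `BusyException` is raised. A sketch of the development entry:

```yaml
# config/database.yml
development:
  adapter: sqlite3
  database: db/development.sqlite3
  pool: 5
  timeout: 5000   # wait up to 5 s for the lock instead of failing immediately
```

This doesn't fix a transaction that is never closed, but it papers over brief lock contention from consoles or background workers.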