Path: doc/release_notes/3.24.0.txt
Last Update: Fri Jul 13 14:52:08 +0000 2012
The prepared_statements plugin makes the model use prepared statements for instance-level inserts, updates, deletes, and lookups by primary key:

  Artist.plugin :prepared_statements
  Artist.create(:name=>'Foo')
  a = Artist[1]
  a.update(:name=>'Bar')
  a.destroy
The reason for this plugin is that Sequel's default behavior of using only the values specifically set when creating instances, and of having update set only the changed columns, can lead to a large number of prepared statements being created.
For prepared statements to be used, each set of columns in the insert and update statements needs to have its own prepared statement. If you have a table with 1 primary key column and 4 other columns, you can have up to 2^4 = 16 prepared statements created, one for each subset of the 4 columns. If you have 1 primary key column and 20 other columns, there are over a million subsets, and you could hit your database limit for prepared statements (a denial of service attack).
Using the prepared_statements_safe plugin mitigates this issue by reducing the number of columns that may or may not be present in the query, in many cases making sure that each model will only have a single INSERT and a single UPDATE prepared statement.
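As an illustration, here is a minimal sketch of enabling both plugins together; the Album model and its columns are assumptions, not part of this release:

  Album.plugin :prepared_statements
  Album.plugin :prepared_statements_safe
  # With prepared_statements_safe loaded, inserts and updates use a
  # predictable column list, so the model generally needs only a single
  # INSERT and a single UPDATE prepared statement.
  Album.create(:name=>'RF')
  Album[1].update(:name=>'M')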
The prepared_statements_associations plugin makes regular association method calls use prepared statements where possible:

  Artist.plugin :prepared_statements_associations
  Artist.one_to_many :albums
  Artist[1].albums
Will use a prepared statement to return the albums for that artist. This plugin works for all supported association types. For some associations (filtered and custom associations), Sequel cannot currently use a prepared statement reliably; for those, Sequel will use a regular query.
The prepared_statements_with_pk plugin makes Dataset#with_pk on model datasets use prepared statements:

  Artist.plugin :prepared_statements_with_pk
  Artist.filter(...).with_pk(1)
Will use a prepared statement for this query. The most benefit from prepared statements comes from queries that are expensive to parse and plan but quick to execute, so using this plugin with a complex filter can in certain cases yield significant performance improvements.
However, this plugin should be considered unsafe as it is possible that it will create an unbounded number of prepared statements. It extracts parameters from the dataset using Dataset#unbind (explained below), so if your code has conditions that vary per query but that Dataset#unbind does not handle, an unbounded number of prepared statements can be created. For example:
  Artist.filter(:a=>params[:b].to_i).with_pk(1)
  Artist.exclude{a > params[:b].to_i}.with_pk(1)
are safe, but:
  Artist.filter(:a=>[1, params[:b].to_i]).with_pk(1)
  Artist.exclude{a > params[:b].to_i + 2}.with_pk(1)
are not. For queries that are not safe, Dataset#with_pk should not be used with this plugin; you should instead look up by primary key manually (using a regular query):
Artist.filter(:a=>[1, params[:b].to_i])[:id=>1]
or using the prepared statement API to create a custom prepared statement:
  # PS = {}
  PS[:name] ||= Artist.filter(:a=>[1, :$b], :id=>:$id).
    prepare(:select, :name)
  PS[:name].call(:b=>params[:b].to_i, :id=>1)
Sequel already supported filtering by associations:

  Album.filter(:artist=>artist)
This capability is much expanded in 3.24.0, allowing you to exclude by associations:
Album.exclude(:artist=>artist)
This will match all albums not by that artist.
You can also filter or exclude by multiple associated objects:
  Album.filter(:artist=>[artist1, artist2])
  Album.exclude(:artist=>[artist1, artist2])
The filtered dataset will match all albums by either of those two artists, and the excluded dataset will match all albums not by either of those two artists.
You can also filter or exclude by using a model dataset:
  Album.filter(:artist=>Artist.filter(:name.like('A%'))).all
  Album.exclude(:artist=>Artist.filter(:name.like('A%'))).all
Here the filtered dataset will match all albums where the associated artist has a name that begins with A, and the excluded dataset will match all albums where the associated artist does not have a name that begins with A.
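For a simple many_to_one association, the dataset form behaves roughly like a subquery on the foreign key. The following is only a sketch of an equivalent query, assuming albums.artist_id references artists.id; the exact SQL Sequel generates may differ:

  Album.filter(:artist_id=>Artist.filter(:name.like('A%')).select(:id)).all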
All of these types of filtering and excluding work with all of the association types that ship with Sequel, even the many_through_many plugin.
Model around hooks are now supported; they wrap the corresponding before hooks, the database action, and the after hooks. For example, an around_save hook can be used to handle database errors raised while saving:

  class Artist < Sequel::Model
    def around_save
      super
    rescue Sequel::DatabaseError => e
      # log the error
      raise
    end
  end
All around hooks should call super, not yield. If an around hook doesn't call super or yield, it is treated as a hook failure, similar to before hooks returning false.
For around_validation, the return value of super should be whether the object is valid. For other around hooks, the return value of super is currently true, but it's possible that will change in the future.
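As an illustration of using around_validation's return value, the sketch below skips an expensive check when the object is already invalid; the check_name_remotely helper is hypothetical:

  class Artist < Sequel::Model
    def around_validation
      valid = super
      # Only run the expensive external check if the cheap validations passed.
      check_name_remotely if valid # hypothetical helper
      valid
    end
  end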
Model datasets now have a with_pk method, which looks up the object with the given primary key in the current dataset:

  Artist.filter(:name.like('A%')).with_pk(1)
This makes it easier to handle the common case where you want to find a particular object that is associated to another object:
Artist[1].albums_dataset.with_pk(2)
Before, there was no way to do that without manually specifying the primary key column:
Artist[1].albums_dataset[:id=>2]
To use a composite primary key with with_pk, you have to provide an array:
Artist[1].albums_dataset.with_pk([1, 2])
For convenience, calling [] on a model dataset with a single integer argument now performs a primary key lookup, so the association example above can also be written as:

  Artist[1].albums_dataset[2]
Note that for backwards compatibility, this only works for single integer primary keys. If you have a composite primary key or a string/varchar primary key, you have to use with_pk.
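For example, assuming a hypothetical Track model whose primary key is a string code, you would use with_pk rather than []:

  Track.dataset.with_pk('abc123')  # primary key lookup on a string key
  # Track.dataset['abc123']        # not supported; [] only handles integer keys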
Dataset#unbind was added, which replaces literal values in a dataset's conditions with placeholders and returns the extracted bound values:

  ds, bv = DB[:table].filter(:a=>1).unbind
  ds # DB[:table].filter(:a=>:$a)
  bv # {:a=>1}
The purpose of doing this is that you can then use prepare or call on the returned dataset with the returned bound variables:
  ds.call(:select, bv)
  # SELECT * FROM table WHERE (a = ?); [1]

  ps = ds.prepare(:select, :ps_name)
  # PREPARE ps_name AS SELECT * FROM table WHERE (a = ?)
  ps.call(bv)
  # EXECUTE ps_name(1)
Basically, Dataset#unbind takes a specific statement and attempts to turn it into a generic statement, along with the placeholder values it extracted.
Unfortunately, Dataset#unbind cannot handle all cases. For example:
DB[:table].filter{a + 1 > 10}.unbind
will not unbind any values. Also, if a query uses multiple different values for the same column, unbind will raise an UnbindDuplicate exception:
DB[:table].filter(:a=>1).or(:a=>2).unbind
The defaults_setter plugin sets default column values on new model instances, and allows you to override the defaults per model:

  Album.plugin :defaults_setter
  Album.new.values # {:copies_sold => 0}
  Album.default_values[:copies_sold] = 42
  Album.new.values # {:copies_sold => 42}
Before, this was commonly done in an after_initialize hook, but that's slower as it is also called for model instances loaded from the database.
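For comparison, the older after_initialize approach looked roughly like the sketch below (the Album model and copies_sold column are assumptions); note that this hook runs for every instance, including rows loaded from the database:

  class Album < Sequel::Model
    def after_initialize
      super
      # Runs for both new and loaded instances, so guard on new?
      self.copies_sold ||= 0 if new?
    end
  end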
The Database#supports_create_table_if_not_exists? method was added to allow users to determine whether the database supports the CREATE TABLE IF NOT EXISTS syntax.
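A sketch of using the capability check (the artists table and its columns are just an example):

  if DB.supports_create_table_if_not_exists?
    DB.run "CREATE TABLE IF NOT EXISTS artists (id integer PRIMARY KEY, name text)"
  else
    DB.create_table(:artists){primary_key :id; String :name} unless DB.table_exists?(:artists)
  end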
Constants such as Sequel::NULL, Sequel::TRUE, and Sequel::FALSE represent the corresponding SQL values:

  Sequel::NULL # NULL
  ~Sequel::NULL # NOT NULL
  Sequel::TRUE # TRUE
  ~Sequel::TRUE # FALSE
On PostgreSQL, conversion procs for named types can be registered globally via PG_NAMED_TYPES:

  Sequel::Postgres::PG_NAMED_TYPES[:interval] = proc{|v| ...}
The conversion procs now use a separate hash per Database object instead of a hash shared across all Database objects. You can now modify the types for a particular Database object, but you have to use the type oid:
DB.conversion_procs[42] = proc{|v| ...}
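A sketch of one way to find a type's oid, by querying the pg_type catalog (the interval type is just an example, and the parsing proc is left as a stub):

  oid = DB[:pg_type].where(:typname=>'interval').get(:oid)
  DB.conversion_procs[oid] = proc{|v| v} # replace with real parsing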
You can now use true and false directly as filter arguments:

  DB[:table].filter(true)
  DB[:table].filter(false)
Unfortunately, because SQLite and MSSQL don't have a real boolean type, these will not work:
  DB[:table].filter{a & true}
  DB[:table].filter{a & false}
You currently have to work around the issue by doing:
  DB[:table].filter{a & Sequel::TRUE}
  DB[:table].filter{a & Sequel::FALSE}
It is possible that a future version of Sequel will remove the need for this workaround, but that requires having a separate literalization method specific to filters.