forked from e621ng/e621ng
updated tag unit tests
This commit is contained in:
parent
2f3a6e4a8b
commit
bed94a4e30
1
.gitignore
vendored
@ -2,6 +2,7 @@
|
||||
lib/danbooru_image_resizer/*.so
|
||||
lib/danbooru_image_resizer/*.o
|
||||
lib/danbooru_image_resizer/*.bundle
|
||||
lib/danbooru_image_resizer/*.log
|
||||
db/*.sqlite3
|
||||
log/*.log
|
||||
tmp/**/*
|
||||
|
132
README
@ -1,55 +1,121 @@
|
||||
=== Installation
|
||||
|
||||
It is recommended that you install Danbooru on a Debian-based system (Lenny or newer) since most of the required packages are available on APT. Although Danbooru has been successfully installed on Fedora, CentOS, FreeBSD, and OS X, the following instructions will assume you're installing on Debian. The Debian install script is straightforward and should be simple to adapt for other platforms. Install docs for other platforms are provided, but these are user contributed and may not be up to date. If you want something similar to Danbooru that is easier to install, try Shimmie (http://trac.shishnet.org/shimmie2). Shimmie uses PHP and MySQL and should be straightforward to install on most hosts.
|
||||
It is recommended that you install Danbooru on a Debian-based system (Lenny or
|
||||
newer) since most of the required packages are available on APT. Although
|
||||
Danbooru has been successfully installed on Fedora, CentOS, FreeBSD, and OS X,
|
||||
the following instructions will assume you're installing on Debian. The Debian
|
||||
install script is straightforward and should be simple to adapt for other
|
||||
platforms. Install docs for other platforms are provided, but these are user
|
||||
contributed and may not be up to date. If you want something similar to
|
||||
Danbooru that is easier to install, try Shimmie
|
||||
(http://trac.shishnet.org/shimmie2). Shimmie uses PHP and MySQL and should be
|
||||
straightforward to install on most hosts.
|
||||
|
||||
For best performance, you will need at least 256MB of RAM for PostgreSQL and Rails. The memory requirement will grow as your database gets bigger; the main Danbooru database takes up around 1GB of memory by itself.
|
||||
For best performance, you will need at least 256MB of RAM for PostgreSQL and
|
||||
Rails. The memory requirement will grow as your database gets bigger; the main
|
||||
Danbooru database takes up around 1GB of memory by itself.
|
||||
|
||||
- Danbooru has the following general dependencies: gcc, g++, make, readline, zlib, flex, bison, gd2, bzip2, postgresql-8.4, postgresql-contrib-8.4, ruby, rubygems, memcached, subversion, apache, and phusion passenger. Use your operating system's package management system whenever possible. This will simplify the process of installing init scripts, which will not always happen when compiling from source.
|
||||
- Please read the section below about PostgreSQL and test_parser before proceeding.
|
||||
- Danbooru has the following Ruby gem dependencies: pg, diff-lcs, html5, memcache-client, aws-s3, json, rails (version 3.0)
|
||||
- You may need to specify the path to your PostgreSQL libraries and includes when building the postgres gem. The general format for this is: "gem install postgres -- --with-pgsql-dir=/usr/local/pgsql". Experiment with the other configure settings if this doesn't work.
|
||||
- It's recommended you create a dedicated account for running the Danbooru database and/or web processes. If you go this route:
|
||||
- Use the createuser command while logged in as postgres to grant database access to the danbooru account.
|
||||
- You will need to update the pg_hba.conf file to grant your danbooru account trusted localhost access. Make sure to restart the database server (/etc/init.d/postgresql-8.3 restart) after making any changes.
|
||||
- You now have to check out the Danbooru source code. It's recommended you create it in the /var/www directory, but you can put the code anywhere.
|
||||
- To export from Subversion: "svn export svn://donmai.us/danbooru/trunk danbooru"
|
||||
- Recursively change the owner of this directory to the danbooru account: "chown -R danbooru:danbooru danbooru"
|
||||
- Create a public/data/sample directory.
|
||||
- Compile the resizer at lib/danbooru_image_resizer: "ruby extconf.rb && make". Do not make install it. If this fails you will need to figure out your gd2/libjpeg/libpng dependencies.
|
||||
- Create new database.yml and local_config.rb files in the config directory. Example files are provided.
|
||||
- Create the database: "createdb danbooru"
|
||||
- Load the schema: "psql danbooru < db/postgres.sql"
|
||||
- Run the migrations: "RAILS_ENV=production rake db:migrate"
|
||||
- Start the job daemon: "RAILS_ENV=production app/daemons/job_task_processor_ctl.rb start"
|
||||
- You now need a way of managing the Rails process. The preferred method is using the Phusion Passenger module (see section below). Alternatively you can use Mongrel or fastcgi, there are several examples on the web.
|
||||
- You should now be able to connect to your Danbooru instance. The first account you create will automatically become the administrator, so you should do this first.
|
||||
Danbooru has the following general dependencies: gcc, g++, make, readline,
|
||||
zlib, flex, bison, gd2, bzip2, postgresql-8.4, postgresql-contrib-8.4,
|
||||
ruby1.9, rubygems, memcached, subversion, nginx, and phusion passenger.
|
||||
|
||||
Use your operating system's package management system whenever possible.
|
||||
This will simplify the process of installing init scripts, which will not
|
||||
always happen when compiling from source.
|
||||
|
||||
Please read the section below about PostgreSQL and test_parser before
|
||||
proceeding.
|
||||
|
||||
It's recommended you create a dedicated account for running the Danbooru
|
||||
database and/or web processes. If you go this route:
|
||||
- Use the createuser command while logged in as postgres to grant database
|
||||
access to the danbooru account.
|
||||
- You will need to update the pg_hba.conf file to grant your danbooru
|
||||
account trusted localhost access. Make sure to restart the database server
|
||||
(/etc/init.d/postgresql restart) after making any changes.
|
||||
|
||||
You now have to check out the Danbooru source code. It's recommended you
|
||||
create it in the /var/www directory, but you can put the code anywhere.
|
||||
To export from Git: git clone git://github.com/r888888888/danbooru.git
|
||||
|
||||
Recursively change the owner of this directory to the danbooru account:
|
||||
chown -R danbooru:danbooru danbooru
|
||||
|
||||
Compile the resizer at lib/danbooru_image_resizer: ruby extconf.rb && make
|
||||
|
||||
Create new database.yml and danbooru_local_config.rb files in the config
|
||||
directory. Example files are provided.
|
||||
|
||||
Create the database: createdb danbooru
|
||||
|
||||
Load the schema: psql danbooru < db/development_structure.sql
|
||||
|
||||
Start the job daemon: RAILS_ENV=production
|
||||
app/daemons/job_task_processor_ctl.rb start
|
||||
|
||||
You now need a way of managing the Rails process. The preferred method is
|
||||
using the Phusion Passenger module (see section below). Alternatively you
|
||||
can use Mongrel or fastcgi, there are several examples on the web.
|
||||
|
||||
You should now be able to connect to your Danbooru instance. The first
|
||||
account you create will automatically become the administrator, so you
|
||||
should do this first.
|
||||
|
||||
=== PostgreSQL and test_parser
|
||||
|
||||
Starting with version 1.16, Danbooru relies on PostgreSQL's full text search feature to speed up tag queries. The gains are especially noticeable on tags with large post counts and for multi-tag joins. Unfortunately in order to adapt it for Danbooru a custom parser is required.
|
||||
Starting with version 1.16, Danbooru relies on PostgreSQL's full text search
|
||||
feature to speed up tag queries. The gains are especially noticeable on tags
|
||||
with large post counts and for multi-tag joins. Unfortunately in order to
|
||||
adapt it for Danbooru a custom parser is required.
|
||||
|
||||
The easiest way of doing this on Debian is installing the the postgresql-contrib-8.4 package. You should do this prior to running the Danbooru database migrations.
|
||||
The easiest way of doing this on Debian is installing the
|
||||
postgresql-contrib-8.4 package. You should do this prior to running the
|
||||
Danbooru database migrations.
|
||||
|
||||
=== Apache and Phusion Passenger
|
||||
=== Nginx and Phusion Passenger
|
||||
|
||||
Phusion Passenger is essentially mod_rails, a compiled module for Apache that is similar in functionality to fastcgi. It is used instead of fastcgi or Mongrel to proxy requests between Rails processes that Passenger manages. When used in conjunction with Ruby Enterprise Edition you can see improved performance and memory efficiency. Passenger also makes deployments much easier, requiring that you only touch a file called "restart.txt" in your tmp directory.
|
||||
Nginx is a web server, similar in purpose to Apache. Its event-oriented
|
||||
architecture makes it better at serving static content than Apache, but
|
||||
Danbooru works just as well with Apache if you'd rather use it.
|
||||
|
||||
Installing Passenger on Debian is relatively painless; you can follow the instructions here: http://www.modrails.com/install.html. Passenger will automatically detect Rails folders so the Apache configuration for your site will be basic; the Passenger website explains in detail.
|
||||
Phusion Passenger is essentially mod_rails, a compiled module for Nginx that
|
||||
is similar in functionality to fastcgi. It is used instead of fastcgi or
|
||||
Mongrel to proxy requests between Rails processes that Passenger manages. When
|
||||
used in conjunction with Ruby Enterprise Edition you can see improved
|
||||
performance and memory efficiency. Passenger also makes deployments much
|
||||
easier, requiring that you only touch a file called "restart.txt" in your tmp
|
||||
directory.
|
||||
|
||||
Installing Passenger on Debian is relatively painless; you can follow the
|
||||
instructions here: http://www.modrails.com/install.html. Passenger will
|
||||
automatically detect Rails folders so the Nginx configuration for your site
|
||||
will be basic; the Passenger website explains in detail.
|
||||
|
||||
=== Ruby Enterprise Edition
|
||||
|
||||
REE is a special version of the Ruby interpreter that, among other things, uses a more intelligent malloc routine and performs copy-on-write garbage collection. The end result is better memory usage, up to 30% in ideal cases.
|
||||
REE is a special version of the Ruby interpreter that, among other things,
|
||||
uses a more intelligent malloc routine and performs copy-on-write garbage
|
||||
collection. The end result is better memory usage, up to 30% in ideal cases.
|
||||
|
||||
It is fairly straightforward to install and won't override your existing Ruby installation. Find out more here: http://www.rubyenterpriseedition.com
|
||||
It is fairly straightforward to install and won't override your existing Ruby
|
||||
installation. Find out more here: http://www.rubyenterpriseedition.com
|
||||
|
||||
=== Troubleshooting
|
||||
|
||||
These instructions won't work for everyone. If your setup is not working, here are the steps I usually reccommend to people:
|
||||
These instructions won't work for everyone. If your setup is not working, here
|
||||
are the steps I usually recommend to people:
|
||||
|
||||
1) Test the database. Make sure you can connect to it using psql. Make sure the tables exist. If this fails, you need to work on correctly installing PostgreSQL, importing the initial schema, and running the migrations.
|
||||
1) Test the database. Make sure you can connect to it using psql. Make sure
|
||||
the tables exist. If this fails, you need to work on correctly installing
|
||||
PostgreSQL, importing the initial schema, and running the migrations.
|
||||
|
||||
2) Test the Rails database connection by using ruby script/console. Run Post.count to make sure Rails can connect to the database. If this fails, you need to make sure your Danbooru configuration files are correct.
|
||||
2) Test the Rails database connection by using rails console. Run
|
||||
Post.count to make sure Rails can connect to the database. If this fails, you
|
||||
need to make sure your Danbooru configuration files are correct.
|
||||
|
||||
3) If you're using Mongrel, test connecting directly to the Mongrel process by running elinks http://localhost:PORT. If this fails, you need to debug your Mongrel configuration file.
|
||||
3) If you're using Mongrel, test connecting directly to the Mongrel process by
|
||||
running elinks http://localhost:PORT. If this fails, you need to debug your
|
||||
Mongrel configuration file.
|
||||
|
||||
4) Test Apache to make sure it's proxying requests correctly. If this fails, you need to debug your Apache configuration file.
|
||||
4) Test Nginx to make sure it's proxying requests correctly. If this fails,
|
||||
you need to debug your Nginx configuration file.
|
||||
|
2
app/models/favorite.rb
Normal file
@ -0,0 +1,2 @@
|
||||
class Favorite < ActiveRecord::Base
|
||||
end
|
26
app/models/pool.rb
Normal file
@ -0,0 +1,26 @@
|
||||
class Pool < ActiveRecord::Base
|
||||
validates_uniqueness_of :name
|
||||
validates_presence_of :name
|
||||
validates_format_of :name, :with => /\A[^\s;,]+\Z/, :on => :create, :message => "cannot have whitespace, commas, or semicolons"
|
||||
belongs_to :creator, :class_name => "Person"
|
||||
|
||||
def self.create_anonymous(creator)
|
||||
pool = Pool.create(:name => "TEMP - #{Time.now.to_f}.#{rand(1_000_000)}", :creator => creator)
|
||||
pool.update_attribute(:name, "anonymous:#{pool.id}")
|
||||
pool
|
||||
end
|
||||
|
||||
def neighbor_posts(post)
|
||||
post_ids =~ /\A#{post.id} (\d+)|(\d+) #{post.id} (\d+)|(\d+) #{post.id}\Z/
|
||||
|
||||
if $2 && $3
|
||||
{:previous => $2.to_i, :next => $3.to_i}
|
||||
elsif $1
|
||||
{:next => $1.to_i}
|
||||
elsif $4
|
||||
{:previous => $4.to_i}
|
||||
else
|
||||
nil
|
||||
end
|
||||
end
|
||||
end
|
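The neighbor lookup above leans on a single regex over the pool's space-separated post_ids column. A standalone sketch of that logic (plain Ruby, no ActiveRecord; in the alternation, $1 captures the id after a leading post and $4 the id before a trailing one):

```ruby
# Given a pool's post_ids string ("1 2 3"), find the ids adjacent to
# post_id. Returns a hash with :previous and/or :next, or nil when the
# post is not in the string.
def neighbor_posts(post_ids, post_id)
  post_ids =~ /\A#{post_id} (\d+)|(\d+) #{post_id} (\d+)|(\d+) #{post_id}\Z/
  if $2 && $3
    {:previous => $2.to_i, :next => $3.to_i}   # post is in the middle
  elsif $1
    {:next => $1.to_i}                         # post is first
  elsif $4
    {:previous => $4.to_i}                     # post is last
  end
end
```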
@ -6,6 +6,7 @@ class Post < ActiveRecord::Base
|
||||
after_save :create_version
|
||||
before_save :merge_old_tags
|
||||
before_save :normalize_tags
|
||||
before_save :set_tag_counts
|
||||
has_many :versions, :class_name => "PostVersion"
|
||||
|
||||
module FileMethods
|
||||
@ -133,11 +134,7 @@ class Post < ActiveRecord::Base
|
||||
|
||||
module TagMethods
|
||||
def tag_array(reload = false)
|
||||
if @tag_array.nil? || reload
|
||||
@tag_array = Tag.scan_tags(tag_string)
|
||||
end
|
||||
|
||||
@tag_array
|
||||
Tag.scan_tags(tag_string)
|
||||
end
|
||||
|
||||
def set_tag_counts
|
||||
@ -171,10 +168,10 @@ class Post < ActiveRecord::Base
|
||||
if old_tag_string
|
||||
# If someone else committed changes to this post before we did,
|
||||
# then try to merge the tag changes together.
|
||||
db_tags = Tag.scan_tags(tag_string_was)
|
||||
current_tags = Tag.scan_tags(tag_string_was)
|
||||
new_tags = tag_array()
|
||||
old_tags = Tag.scan_tags(old_tag_string)
|
||||
self.tag_string = (db_tags + (new_tags - old_tags) - (old_tags - new_tags)).uniq.join(" ")
|
||||
self.tag_string = ((current_tags + new_tags) - old_tags + (current_tags & new_tags)).uniq.join(" ")
|
||||
end
|
||||
end
|
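The conflict-merge step above can be exercised in isolation. A hedged sketch of the new formula, where current_tags is what is now in the database, old_tags is the base the editor started from, and new_tags is what the editor submitted:

```ruby
# Three-way tag merge: keep the other editor's changes while applying
# this editor's additions and removals relative to the old tag list.
def merge_tags(current_tags, old_tags, new_tags)
  ((current_tags + new_tags) - old_tags + (current_tags & new_tags)).uniq
end
```

Array#- removes every occurrence, so a tag from old_tags survives only through the `current_tags & new_tags` intersection, i.e. when neither side removed it.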
||||
|
||||
@ -182,87 +179,24 @@ class Post < ActiveRecord::Base
|
||||
normalized_tags = Tag.scan_tags(tag_string)
|
||||
# normalized_tags = TagAlias.to_aliased(normalized_tags)
|
||||
# normalized_tags = TagImplication.with_implications(normalized_tags)
|
||||
normalized_tags = parse_metatags(normalized_tags)
|
||||
normalized_tags = filter_metatags(normalized_tags)
|
||||
self.tag_string = normalized_tags.uniq.join(" ")
|
||||
end
|
||||
|
||||
def parse_metatags(tags)
|
||||
tags.map do |tag|
|
||||
if tag =~ /^(pool|rating|fav|user|uploader):(.+)/
|
||||
case $1
|
||||
when "pool"
|
||||
parse_pool_tag($2)
|
||||
|
||||
when "rating"
|
||||
parse_rating_tag($2)
|
||||
|
||||
when "fav"
|
||||
parse_fav_tag($2)
|
||||
|
||||
when "uploader"
|
||||
# ignore
|
||||
|
||||
when "user"
|
||||
# ignore
|
||||
end
|
||||
|
||||
nil
|
||||
else
|
||||
tag
|
||||
end
|
||||
end.compact
|
||||
end
|
||||
|
||||
def parse_pool_tag(text)
|
||||
case text
|
||||
when "new"
|
||||
pool = Pool.create_anonymous
|
||||
pool.posts << self
|
||||
|
||||
when "recent"
|
||||
raise NotImplementedError
|
||||
|
||||
when /^\d+$/
|
||||
pool = Pool.find_by_id(text.to_i)
|
||||
pool.posts << self if pool
|
||||
|
||||
else
|
||||
pool = Pool.find_by_name(text)
|
||||
pool.posts << self if pool
|
||||
end
|
||||
end
|
||||
|
||||
def parse_rating_tag(rating)
|
||||
case rating
|
||||
when /q/
|
||||
self.rating = "q"
|
||||
|
||||
when /e/
|
||||
self.rating = "e"
|
||||
|
||||
when /s/
|
||||
self.rating = "s"
|
||||
end
|
||||
end
|
||||
|
||||
def parse_fav_tag(text)
|
||||
case text
|
||||
when "add", "new"
|
||||
add_favorite(updater_id)
|
||||
|
||||
when "remove", "rem", "del"
|
||||
remove_favorite(updater_id)
|
||||
end
|
||||
def filter_metatags(tags)
|
||||
tags.reject {|tag| tag =~ /\A(?:pool|rating|fav|approver|uploader):/}
|
||||
end
|
||||
end
|
||||
|
||||
module FavoriteMethods
|
||||
def add_favorite(user_id)
|
||||
self.fav_string += " fav:#{user_id}"
|
||||
def add_favorite(user)
|
||||
self.fav_string += " fav:#{user.name}"
|
||||
self.fav_string.strip!
|
||||
end
|
||||
|
||||
def remove_favorite(user_id)
|
||||
self.fav_string.gsub!(/user:#{user_id}\b\s*/, " ")
|
||||
def remove_favorite(user)
|
||||
self.fav_string.gsub!(/fav:#{user.name}\b\s*/, " ")
|
||||
self.fav_string.strip!
|
||||
end
|
||||
end
|
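Favorites are denormalized into a single fav_string column of "fav:&lt;name&gt;" tokens, as the methods above show. A minimal standalone sketch of that string manipulation, with a plain class standing in for the Post model:

```ruby
# Maintains a space-separated list of "fav:<name>" tokens.
class FavString
  attr_reader :fav_string

  def initialize
    @fav_string = ""
  end

  def add_favorite(name)
    @fav_string = (@fav_string + " fav:#{name}").strip
  end

  def remove_favorite(name)
    @fav_string = @fav_string.gsub(/fav:#{name}\b\s*/, " ").strip
  end
end
```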
||||
|
||||
@ -300,6 +234,48 @@ class Post < ActiveRecord::Base
|
||||
"''" + escaped_token + "''"
|
||||
end
|
||||
end
|
||||
|
||||
def add_tag_string_search_relation(tags, relation)
|
||||
tag_query_sql = []
|
||||
|
||||
if tags[:include].any?
|
||||
tag_query_sql << "(" + escape_string_for_tsquery(tags[:include]).join(" | ") + ")"
|
||||
end
|
||||
|
||||
if tags[:related].any?
|
||||
raise SearchError.new("You cannot search for more than #{Danbooru.config.tag_query_limit} tags at a time") if tags[:related].size > Danbooru.config.tag_query_limit
|
||||
tag_query_sql << "(" + escape_string_for_tsquery(tags[:related]).join(" & ") + ")"
|
||||
end
|
||||
|
||||
if tags[:exclude].any?
|
||||
raise SearchError.new("You cannot search for more than #{Danbooru.config.tag_query_limit} tags at a time") if tags[:exclude].size > Danbooru.config.tag_query_limit
|
||||
|
||||
if tags[:related].any? || tags[:include].any?
|
||||
tag_query_sql << "!(" + escape_string_for_tsquery(tags[:exclude]).join(" | ") + ")"
|
||||
else
|
||||
raise SearchError.new("You cannot search for only excluded tags")
|
||||
end
|
||||
end
|
||||
|
||||
if tag_query_sql.any?
|
||||
relation.where("posts.tag_index @@ to_tsquery('danbooru', E'" + tag_query_sql.join(" & ") + "')")
|
||||
end
|
||||
end
|
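add_tag_string_search_relation reduces the three tag buckets to one tsquery expression: included tags are OR'd, required tags AND'd, excluded tags negated, and the groups joined with &amp;. A sketch of just the string assembly (the tsquery escaping and the query-limit checks from the method above are omitted):

```ruby
# Build the tsquery text from {:include, :related, :exclude} tag lists.
def build_tsquery(tags)
  parts = []
  parts << "(" + tags[:include].join(" | ") + ")" if tags[:include].any?
  parts << "(" + tags[:related].join(" & ") + ")" if tags[:related].any?
  parts << "!(" + tags[:exclude].join(" | ") + ")" if tags[:exclude].any?
  parts.join(" & ")
end
```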
||||
|
||||
def add_tag_subscription_relation(subscriptions, relation)
|
||||
subscriptions.each do |subscription|
|
||||
subscription =~ /^(.+?):(.+)$/
|
||||
user_name = $1 || subscription
|
||||
subscription_name = $2
|
||||
|
||||
user = User.find_by_name(user_name)
|
||||
|
||||
if user
|
||||
post_ids = TagSubscription.find_post_ids(user.id, subscription_name)
|
||||
relation.where(["posts.id IN (?)", post_ids])
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
def build_relation(q, options = {})
|
||||
unless q.is_a?(Hash)
|
||||
@ -326,7 +302,7 @@ class Post < ActiveRecord::Base
|
||||
end
|
||||
|
||||
if q[:md5].is_a?(String)
|
||||
relation.where(["posts.md5 IN (?)", q[:md5].split(/,/)])
|
||||
relation.where(["posts.md5 IN (?)", q[:md5]])
|
||||
end
|
||||
|
||||
if q[:status] == "deleted"
|
||||
@ -343,45 +319,11 @@ class Post < ActiveRecord::Base
|
||||
relation.where(["posts.source LIKE ? ESCAPE E'\\\\'", q[:source]])
|
||||
end
|
||||
|
||||
if q[:subscriptions].is_a?(String)
|
||||
raise NotImplementedError
|
||||
|
||||
q[:subscriptions] =~ /^(.+?):(.+)$/
|
||||
username = $1 || q[:subscriptions]
|
||||
subscription_name = $2
|
||||
|
||||
user = User.find_by_name(username)
|
||||
|
||||
if user
|
||||
post_ids = TagSubscription.find_post_ids(user.id, subscription_name)
|
||||
relation.where(["posts.id IN (?)", post_ids])
|
||||
end
|
||||
if q[:subscriptions].any?
|
||||
add_tag_subscription_relation(q[:subscriptions], relation)
|
||||
end
|
||||
|
||||
tag_query_sql = []
|
||||
|
||||
if q[:include].any?
|
||||
tag_query_sql << "(" + escape_string_for_tsquery(q[:include]).join(" | ") + ")"
|
||||
end
|
||||
|
||||
if q[:related].any?
|
||||
raise SearchError.new("You cannot search for more than #{Danbooru.config.tag_query_limit} tags at a time") if q[:related].size > Danbooru.config.tag_query_limit
|
||||
tag_query_sql << "(" + escape_string_for_tsquery(q[:related]).join(" & ") + ")"
|
||||
end
|
||||
|
||||
if q[:exclude].any?
|
||||
raise SearchError.new("You cannot search for more than #{Danbooru.config.tag_query_limit} tags at a time") if q[:exclude].size > Danbooru.config.tag_query_limit
|
||||
|
||||
if q[:related].any? || q[:include].any?
|
||||
tag_query_sql << "!(" + escape_string_for_tsquery(q[:exclude]).join(" | ") + ")"
|
||||
else
|
||||
raise SearchError.new("You cannot search for only excluded tags")
|
||||
end
|
||||
end
|
||||
|
||||
if tag_query_sql.any?
|
||||
relation.where("posts.tag_index @@ to_tsquery('danbooru', E'" + tag_query_sql.join(" & ") + "')")
|
||||
end
|
||||
add_tag_string_search_relation(q[:tags], relation)
|
||||
|
||||
if q[:rating] == "q"
|
||||
relation.where("posts.rating = 'q'")
|
||||
@ -398,7 +340,7 @@ class Post < ActiveRecord::Base
|
||||
elsif q[:rating_negated] == "e"
|
||||
relation.where("posts.rating <> 'e'")
|
||||
end
|
||||
|
||||
|
||||
case q[:order]
|
||||
when "id", "id_asc"
|
||||
relation.order("posts.id")
|
||||
@ -455,19 +397,35 @@ class Post < ActiveRecord::Base
|
||||
|
||||
module UploaderMethods
|
||||
def uploader_id=(user_id)
|
||||
self.uploader_string = "user:#{user_id}"
|
||||
self.uploader = User.find(user_id)
|
||||
end
|
||||
|
||||
def uploader_id
|
||||
uploader_string[5, 100].to_i
|
||||
uploader.id
|
||||
end
|
||||
|
||||
def uploader_name
|
||||
uploader_string[5..-1]
|
||||
end
|
||||
|
||||
def uploader
|
||||
User.find(uploader_id)
|
||||
User.find_by_name(uploader_name)
|
||||
end
|
||||
|
||||
def uploader=(user)
|
||||
self.uploader_id = user.id
|
||||
self.uploader_string = "user:#{user.name}"
|
||||
end
|
||||
end
|
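The uploader is likewise denormalized into a "user:&lt;name&gt;" string, and the name is recovered by slicing off the five-character prefix. A sketch, assuming the string always carries that prefix:

```ruby
# "user:" is 5 characters, so index 5 onward is the raw user name.
def uploader_name(uploader_string)
  uploader_string[5..-1]
end
```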
||||
|
||||
module PoolMethods
|
||||
def add_pool(pool)
|
||||
self.pool_string += " pool:#{pool.name}"
|
||||
self.pool_string.strip!
|
||||
end
|
||||
|
||||
def remove_pool(pool)
|
||||
self.pool_string.gsub!(/pool:#{pool.name}\b\s*/, " ")
|
||||
self.pool_string.strip!
|
||||
end
|
||||
end
|
||||
|
||||
@ -479,4 +437,5 @@ class Post < ActiveRecord::Base
|
||||
include TagMethods
|
||||
include FavoriteMethods
|
||||
include UploaderMethods
|
||||
include PoolMethods
|
||||
end
|
||||
|
@ -1,6 +1,7 @@
|
||||
class Tag < ActiveRecord::Base
|
||||
attr_accessible :category
|
||||
after_save :update_category_cache
|
||||
named_scope :by_pattern, lambda {|name| where(["name LIKE ? ESCAPE E'\\\\'", name.to_escaped_for_sql_like])}
|
||||
|
||||
class CategoryMapping
|
||||
Danbooru.config.reverse_tag_category_mapping.each do |value, category|
|
||||
@ -30,26 +31,19 @@ class Tag < ActiveRecord::Base
|
||||
@category_mapping ||= CategoryMapping.new
|
||||
end
|
||||
|
||||
def select_category_for(tag_name)
|
||||
select_value_sql("SELECT category FROM tags WHERE name = ?", tag_name).to_i
|
||||
end
|
||||
|
||||
def category_for(tag_name)
|
||||
Cache.get("tc:#{Cache.sanitize(tag_name)}") do
|
||||
select_value_sql("SELECT category FROM tags WHERE name = ?", tag_name).to_i
|
||||
select_category_for(tag_name)
|
||||
end
|
||||
end
|
||||
|
||||
def categories_for(tag_names)
|
||||
key_hash = tag_names.inject({}) do |hash, x|
|
||||
hash[x] = "tc:#{Cache.sanitize(x)}"
|
||||
hash
|
||||
end
|
||||
categories_hash = MEMCACHE.get_multi(key_hash.values)
|
||||
returning({}) do |result_hash|
|
||||
key_hash.each do |tag_name, hash_key|
|
||||
if categories_hash.has_key?(hash_key)
|
||||
result_hash[tag_name] = categories_hash[hash_key]
|
||||
else
|
||||
result_hash[tag_name] = category_for(tag_name)
|
||||
end
|
||||
end
|
||||
Cache.get_multi(tag_names, "tc") do |name|
|
||||
select_category_for(name)
|
||||
end
|
||||
end
|
||||
end
|
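categories_for now delegates batching to Cache.get_multi, which fetches many keys at once and computes misses with the block. A sketch of that contract with a plain Hash standing in for memcached (the real Cache.get_multi signature in the codebase may differ):

```ruby
# For each name, look up "<prefix>:<name>" in store; on a miss, compute
# the value with the block and cache it. Returns name => value.
def get_multi(store, names, prefix)
  names.each_with_object({}) do |name, result|
    key = "#{prefix}:#{name}"
    store[key] = yield(name) unless store.key?(key)
    result[name] = store[key]
  end
end
```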
||||
@ -191,63 +185,110 @@ class Tag < ActiveRecord::Base
|
||||
|
||||
end
|
||||
end
|
||||
|
||||
def parse_tag(tag, output)
|
||||
if tag[0] == "-" && tag.size > 1
|
||||
output[:exclude] << tag[1..-1]
|
||||
|
||||
elsif tag =~ /\*/
|
||||
matches = Tag.by_pattern(tag).all(:select => "name", :limit => 25, :order => "post_count DESC").map(&:name)
|
||||
matches = ["~no_matches~"] if matches.empty?
|
||||
output[:include] += matches
|
||||
|
||||
else
|
||||
output[:related] << tag
|
||||
end
|
||||
end
|
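parse_tag's classification can be tried standalone: "-"-prefixed tags are excluded, tags containing "*" expand to matching names, and everything else is a plain required tag. In this sketch the wildcard expansion (Tag.by_pattern in the method above) is stubbed out as an injectable lambda:

```ruby
# Classify one token into output[:exclude], [:include], or [:related].
def parse_tag(tag, output, expand = ->(pat) { [] })
  if tag[0] == "-" && tag.size > 1
    output[:exclude] << tag[1..-1]
  elsif tag =~ /\*/
    matches = expand.call(tag)
    matches = ["~no_matches~"] if matches.empty?   # force an empty result set
    output[:include] += matches
  else
    output[:related] << tag
  end
  output
end
```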
||||
|
||||
def parse_query(query, options = {})
|
||||
q = Hash.new {|h, k| h[k] = []}
|
||||
q[:tags] = {
|
||||
:related => [],
|
||||
:include => [],
|
||||
:exclude => []
|
||||
}
|
||||
|
||||
scan_query(query).each do |token|
|
||||
if token =~ /\A(sub|md5|-rating|rating|width|height|mpixels|score|filesize|source|id|date|order|change|status|tagcount|gentagcount|arttagcount|chartagcount|copytagcount):(.+)\Z/
|
||||
if $1 == "sub"
|
||||
q[:subscriptions] = $2
|
||||
elsif $1 == "md5"
|
||||
q[:md5] = $2
|
||||
elsif $1 == "-rating"
|
||||
if token =~ /\A(-uploader|uploader|-pool|pool|-fav|fav|sub|md5|-rating|rating|width|height|mpixels|score|filesize|source|id|date|order|status|tagcount|gentags|arttags|chartags|copytags):(.+)\Z/
|
||||
case $1
|
||||
when "-uploader"
|
||||
q[:tags][:exclude] << token[1..-1]
|
||||
|
||||
when "uploader"
|
||||
q[:tags][:related] << token
|
||||
|
||||
when "-pool"
|
||||
q[:tags][:exclude] << token[1..-1]
|
||||
|
||||
when "pool"
|
||||
q[:tags][:related] << token
|
||||
|
||||
when "-fav"
|
||||
q[:tags][:exclude] << token[1..-1]
|
||||
|
||||
when "fav"
|
||||
q[:tags][:related] << token
|
||||
|
||||
when "sub"
|
||||
q[:subscriptions] << $2
|
||||
|
||||
when "md5"
|
||||
q[:md5] = $2.split(/,/)
|
||||
|
||||
when "-rating"
|
||||
q[:rating_negated] = $2
|
||||
elsif $1 == "rating"
|
||||
|
||||
when "rating"
|
||||
q[:rating] = $2
|
||||
elsif $1 == "id"
|
||||
|
||||
when "id"
|
||||
q[:post_id] = parse_helper($2)
|
||||
elsif $1 == "width"
|
||||
|
||||
when "width"
|
||||
q[:width] = parse_helper($2)
|
||||
elsif $1 == "height"
|
||||
|
||||
when "height"
|
||||
q[:height] = parse_helper($2)
|
||||
elsif $1 == "mpixels"
|
||||
|
||||
when "mpixels"
|
||||
q[:mpixels] = parse_helper($2, :float)
|
||||
elsif $1 == "score"
|
||||
|
||||
when "score"
|
||||
q[:score] = parse_helper($2)
|
||||
elsif $1 == "filesize"
|
||||
|
||||
when "filesize"
|
||||
q[:filesize] = parse_helper($2, :filesize)
|
||||
elsif $1 == "source"
|
||||
|
||||
when "source"
|
||||
q[:source] = $2.to_escaped_for_sql_like + "%"
|
||||
elsif $1 == "date"
|
||||
|
||||
when "date"
|
||||
q[:date] = parse_helper($2, :date)
|
||||
elsif $1 == "tagcount"
|
||||
|
||||
when "tagcount"
|
||||
q[:tag_count] = parse_helper($2)
|
||||
elsif $1 == "gentagcount"
|
||||
|
||||
when "gentags"
|
||||
q[:general_tag_count] = parse_helper($2)
|
||||
elsif $1 == "arttagcount"
|
||||
|
||||
when "arttags"
|
||||
q[:artist_tag_count] = parse_helper($2)
|
||||
elsif $1 == "chartagcount"
|
||||
|
||||
when "chartags"
|
||||
q[:character_tag_count] = parse_helper($2)
|
||||
elsif $1 == "copytagcount"
|
||||
|
||||
when "copytags"
|
||||
q[:copyright_tag_count] = parse_helper($2)
|
||||
elsif $1 == "order"
|
||||
|
||||
when "order"
|
||||
q[:order] = $2
|
||||
elsif $1 == "change"
|
||||
q[:change] = parse_helper($2)
|
||||
elsif $1 == "status"
|
||||
|
||||
when "status"
|
||||
q[:status] = $2
|
||||
end
|
||||
elsif token[0] == "-" && token.size > 1
|
||||
q[:exclude] << token[1..-1]
|
||||
elsif token[0] == "~" && token.size > 1
|
||||
q[:include] << token[1..-1]
|
||||
elsif token.include?("*")
|
||||
matches = where(["name LIKE ? ESCAPE E'\\\\'", token.to_escaped_for_sql_like]).all(:select => "name", :limit => 25, :order => "post_count DESC").map(&:name)
|
||||
matches = ["~no_matches~"] if matches.empty?
|
||||
q[:include] += matches
|
||||
|
||||
else
|
||||
q[:related] << token
|
||||
parse_tag(token, q[:tags])
|
||||
end
|
||||
end
|
||||
|
||||
@ -257,9 +298,9 @@ class Tag < ActiveRecord::Base
|
||||
end
|
||||
|
||||
def normalize_tags_in_query(query_hash)
|
||||
query_hash[:exclude] = TagAlias.to_aliased(query_hash[:exclude], :strip_prefix => true) if query_hash.has_key?(:exclude)
|
||||
query_hash[:include] = TagAlias.to_aliased(query_hash[:include], :strip_prefix => true) if query_hash.has_key?(:include)
|
||||
query_hash[:related] = TagAlias.to_aliased(query_hash[:related]) if query_hash.has_key?(:related)
|
||||
query_hash[:tags][:exclude] = TagAlias.to_aliased(query_hash[:tags][:exclude])
|
||||
query_hash[:tags][:include] = TagAlias.to_aliased(query_hash[:tags][:include])
|
||||
query_hash[:tags][:related] = TagAlias.to_aliased(query_hash[:tags][:related])
|
||||
end
|
||||
end
|
||||
|
||||
|
17
app/models/tag_alias.rb
Normal file
@ -0,0 +1,17 @@
|
||||
class TagAlias < ActiveRecord::Base
|
||||
after_save :update_posts
|
||||
|
||||
def self.to_aliased(names)
|
||||
alias_hash = Cache.get_multi(names, "ta") do |name|
|
||||
ta = TagAlias.find_by_antecedent_name(name)
|
||||
if ta
|
||||
ta.consequent_name
|
||||
else
|
||||
name
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
def update_posts
|
||||
end
|
||||
end
|
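TagAlias.to_aliased maps each antecedent name to its consequent, falling back to the name itself. A cache-free sketch with a hypothetical in-memory alias table in place of find_by_antecedent_name:

```ruby
# Hypothetical alias table: antecedent name => consequent name.
ALIASES = {"grey_hair" => "gray_hair"}

def to_aliased(names)
  names.map { |n| ALIASES.fetch(n, n) }
end
```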
17
app/models/tag_implication.rb
Normal file
@ -0,0 +1,17 @@
|
||||
class TagImplication < ActiveRecord::Base
|
||||
after_save :update_descendant_names
|
||||
|
||||
def descendants
|
||||
all = []
|
||||
children = [consequent_name]
|
||||
|
||||
until children.empty?
|
||||
all += children
|
||||
children = self.class.where(["antecedent_name IN (?)", children]).all.map(&:consequent_name)
|
||||
end
|
||||
all
end
|
||||
|
||||
def update_descendant_names
|
||||
self.descendant_names = descendants.join(" ")
|
||||
end
|
||||
end
|
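The descendants walk is a breadth-first expansion over the implication graph. A standalone sketch with a hypothetical IMPLICATIONS map replacing the antecedent_name query, returning the accumulated list that the join in update_descendant_names needs:

```ruby
# Hypothetical implication graph: antecedent => list of consequents.
IMPLICATIONS = {
  "animal" => ["cat", "dog"],
  "cat"    => ["kitten"]
}

# Collect a name and every name it transitively implies.
def descendants(consequent_name)
  all = []
  children = [consequent_name]
  until children.empty?
    all += children
    children = children.flat_map { |c| IMPLICATIONS.fetch(c, []) }
  end
  all
end
```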
@ -113,6 +113,76 @@ CREATE SEQUENCE pending_posts_id_seq
|
||||
ALTER SEQUENCE pending_posts_id_seq OWNED BY pending_posts.id;
|
||||
|
||||
|
||||
--
|
||||
-- Name: pool_versions; Type: TABLE; Schema: public; Owner: -; Tablespace:
|
||||
--
|
||||
|
||||
CREATE TABLE pool_versions (
|
||||
id integer NOT NULL,
|
||||
pool_id integer,
|
||||
post_ids text DEFAULT ''::text NOT NULL,
|
||||
updater_id integer NOT NULL,
|
||||
updater_ip_addr inet NOT NULL,
|
||||
created_at timestamp without time zone,
|
||||
updated_at timestamp without time zone
|
||||
);
|
||||
|
||||
|
||||
--
|
||||
-- Name: pool_versions_id_seq; Type: SEQUENCE; Schema: public; Owner: -
|
||||
--
|
||||
|
||||
CREATE SEQUENCE pool_versions_id_seq
|
||||
START WITH 1
|
||||
INCREMENT BY 1
|
||||
NO MAXVALUE
|
||||
NO MINVALUE
|
||||
CACHE 1;
|
||||
|
||||
|
||||
--
|
||||
-- Name: pool_versions_id_seq; Type: SEQUENCE OWNED BY; Schema: public; Owner: -
|
||||
--
|
||||
|
||||
ALTER SEQUENCE pool_versions_id_seq OWNED BY pool_versions.id;
|
||||
|
||||
|
||||
--
|
||||
-- Name: pools; Type: TABLE; Schema: public; Owner: -; Tablespace:
|
||||
--
|
||||
|
||||
CREATE TABLE pools (
|
||||
id integer NOT NULL,
|
||||
name character varying(255),
|
||||
creator_id integer NOT NULL,
|
||||
description text,
|
||||
is_public boolean DEFAULT true NOT NULL,
|
||||
is_active boolean DEFAULT true NOT NULL,
|
||||
post_ids text DEFAULT ''::text NOT NULL,
|
||||
created_at timestamp without time zone,
|
||||
updated_at timestamp without time zone
|
||||
);
|
||||
|
||||
|
||||
--
|
||||
-- Name: pools_id_seq; Type: SEQUENCE; Schema: public; Owner: -
|
||||
--
|
||||
|
||||
CREATE SEQUENCE pools_id_seq
|
||||
START WITH 1
|
||||
INCREMENT BY 1
|
||||
NO MAXVALUE
|
||||
NO MINVALUE
|
||||
CACHE 1;
|
||||
|
||||
|
||||
--
|
||||
-- Name: pools_id_seq; Type: SEQUENCE OWNED BY; Schema: public; Owner: -
|
||||
--
|
||||
|
||||
ALTER SEQUENCE pools_id_seq OWNED BY pools.id;
|
||||
|
||||
|
||||
--
|
||||
-- Name: post_versions; Type: TABLE; Schema: public; Owner: -; Tablespace:
|
||||
--
|
||||
@@ -174,7 +244,7 @@ CREATE TABLE posts (
    view_count integer DEFAULT 0 NOT NULL,
    last_noted_at timestamp without time zone,
    last_commented_at timestamp without time zone,
-   tag_string text NOT NULL,
+   tag_string text DEFAULT ''::text NOT NULL,
    tag_index tsvector,
    tag_count integer DEFAULT 0 NOT NULL,
    tag_count_general integer DEFAULT 0 NOT NULL,
@@ -341,6 +411,20 @@ ALTER SEQUENCE users_id_seq OWNED BY users.id;

ALTER TABLE pending_posts ALTER COLUMN id SET DEFAULT nextval('pending_posts_id_seq'::regclass);


--
-- Name: id; Type: DEFAULT; Schema: public; Owner: -
--

ALTER TABLE pool_versions ALTER COLUMN id SET DEFAULT nextval('pool_versions_id_seq'::regclass);


--
-- Name: id; Type: DEFAULT; Schema: public; Owner: -
--

ALTER TABLE pools ALTER COLUMN id SET DEFAULT nextval('pools_id_seq'::regclass);


--
-- Name: id; Type: DEFAULT; Schema: public; Owner: -
--
@@ -384,6 +468,22 @@ ALTER TABLE ONLY pending_posts
    ADD CONSTRAINT pending_posts_pkey PRIMARY KEY (id);


--
-- Name: pool_versions_pkey; Type: CONSTRAINT; Schema: public; Owner: -; Tablespace:
--

ALTER TABLE ONLY pool_versions
    ADD CONSTRAINT pool_versions_pkey PRIMARY KEY (id);


--
-- Name: pools_pkey; Type: CONSTRAINT; Schema: public; Owner: -; Tablespace:
--

ALTER TABLE ONLY pools
    ADD CONSTRAINT pools_pkey PRIMARY KEY (id);


--
-- Name: post_versions_pkey; Type: CONSTRAINT; Schema: public; Owner: -; Tablespace:
--
@@ -424,6 +524,27 @@ ALTER TABLE ONLY users
    ADD CONSTRAINT users_pkey PRIMARY KEY (id);


--
-- Name: index_pool_versions_on_pool_id; Type: INDEX; Schema: public; Owner: -; Tablespace:
--

CREATE INDEX index_pool_versions_on_pool_id ON pool_versions USING btree (pool_id);


--
-- Name: index_pools_on_creator_id; Type: INDEX; Schema: public; Owner: -; Tablespace:
--

CREATE INDEX index_pools_on_creator_id ON pools USING btree (creator_id);


--
-- Name: index_pools_on_name; Type: INDEX; Schema: public; Owner: -; Tablespace:
--

CREATE INDEX index_pools_on_name ON pools USING btree (name);


--
-- Name: index_post_versions_on_post_id; Type: INDEX; Schema: public; Owner: -; Tablespace:
--
@@ -574,4 +695,6 @@ INSERT INTO schema_migrations (version) VALUES ('20100205163027');

INSERT INTO schema_migrations (version) VALUES ('20100205224030');

INSERT INTO schema_migrations (version) VALUES ('20100209201251');

INSERT INTO schema_migrations (version) VALUES ('20100211025616');
db/migrate/20100211025616_create_pools.rb (new file)
@@ -0,0 +1,32 @@
class CreatePools < ActiveRecord::Migration
  def self.up
    create_table :pools do |t|
      t.column :name, :string
      t.column :creator_id, :integer, :null => false
      t.column :description, :text
      t.column :is_public, :boolean, :null => false, :default => true
      t.column :is_active, :boolean, :null => false, :default => true
      t.column :post_ids, :text, :null => false, :default => ""
      t.timestamps
    end

    add_index :pools, :name
    add_index :pools, :creator_id

    create_table :pool_versions do |t|
      t.column :pool_id, :integer
      t.column :post_ids, :text, :null => false, :default => ""
      t.column :updater_id, :integer, :null => false
      t.column :updater_ip_addr, "inet", :null => false
      t.timestamps
    end

    add_index :pool_versions, :pool_id
  end

  def self.down
    drop_table :pools
    drop_table :pool_versions
  end
end
db/migrate/20100211181944_create_favorites.rb (new file)
@@ -0,0 +1,19 @@
class CreateFavorites < ActiveRecord::Migration
  def self.up
    (0..9).each do |number|
      create_table "favorites_#{number}" do |t|
        t.column :post_id, :integer
        t.column :user_id, :integer
      end

      add_index "favorites_#{number}", :post_id
      add_index "favorites_#{number}", :user_id
    end
  end

  def self.down
    (0..9).each do |number|
      drop_table "favorites_#{number}"
    end
  end
end
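Nothing in this migration shows how a shard is chosen, but ten identically-shaped `favorites_N` tables strongly suggest hashing on one of the columns. A plausible sketch (the modulo-on-user_id convention is an assumption, not part of this commit):

```ruby
# Assumed sharding convention (hypothetical): a user's favorites live in
# exactly one of favorites_0 .. favorites_9, selected by user_id modulo 10,
# so lookups by user only ever touch one of the ten tables.
def favorites_table_for(user_id)
  "favorites_#{user_id % 10}"
end
```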
db/migrate/20100211191709_create_tag_aliases.rb (new file)
@@ -0,0 +1,17 @@
class CreateTagAliases < ActiveRecord::Migration
  def self.up
    create_table :tag_aliases do |t|
      t.column :antecedent_name, :string, :null => false
      t.column :consequent_name, :string, :null => false
      t.column :creator_id, :integer, :null => false
      t.column :request_ids, :string
      t.timestamps
    end

    add_index :tag_aliases, :antecedent_name
  end

  def self.down
    drop_table :tag_aliases
  end
end
db/migrate/20100211191716_create_tag_implications.rb (new file)
@@ -0,0 +1,18 @@
class CreateTagImplications < ActiveRecord::Migration
  def self.up
    create_table :tag_implications do |t|
      t.column :antecedent_name, :string, :null => false
      t.column :consequent_name, :string, :null => false
      t.column :descendant_names, :text, :null => false
      t.column :creator_id, :integer, :null => false
      t.column :request_ids, :string
      t.timestamps
    end

    add_index :tag_implications, :antecedent_name
  end

  def self.down
    drop_table :tag_implications
  end
end
@@ -1,55 +0,0 @@
=== Installation

It is recommended that you install Danbooru on a Debian-based system (Lenny or newer) since most of the required packages are available on APT. Although Danbooru has been successfully installed on Fedora, CentOS, FreeBSD, and OS X, the following instructions will assume you're installing on Debian. The Debian install script is straightforward and should be simple to adapt for other platforms. Install docs for other platforms are provided, but these are user contributed and may not be up to date. If you want something similar to Danbooru that is easier to install, try Shimmie (http://trac.shishnet.org/shimmie2). Shimmie uses PHP and MySQL and should be straightforward to install on most hosts.

For best performance, you will need at least 256MB of RAM for PostgreSQL and Rails. The memory requirement will grow as your database gets bigger; the main Danbooru database takes up around 1GB of memory by itself.

- Danbooru has the following general dependencies: gcc, g++, make, readline, zlib, flex, bison, gd2, bzip2, postgresql-8.4, postgresql-contrib-8.4, ruby, rubygems, memcached, subversion, apache, and phusion passenger. Use your operating system's package management system whenever possible. This will simplify the process of installing init scripts, which will not always happen when compiling from source.
- Please read the section below about PostgreSQL and test_parser before proceeding.
- Danbooru has the following Ruby gem dependencies: pg, diff-lcs, html5, memcache-client, aws-s3, json, rails (version 3.0)
- You may need to specify the path to your PostgreSQL libraries and includes when building the postgres gem. The general format for this is: "gem install postgres -- --with-pgsql-dir=/usr/local/pgsql". Experiment with the other configure settings if this doesn't work.
- It's recommended you create a dedicated account for running the Danbooru database and/or web processes. If you go this route:
  - Use the createuser command while logged in as postgres to grant database access to the danbooru account.
  - You will need to update the pg_hba.conf file to grant your danbooru account trusted localhost access. Make sure to restart the database server (/etc/init.d/postgresql-8.4 restart) after making any changes.
- You now have to check out the Danbooru source code. It's recommended you create it in the /var/www directory, but you can put the code anywhere.
- To export from Subversion: "svn export svn://donmai.us/danbooru/trunk danbooru"
- Recursively change the owner of this directory to the danbooru account: "chown -R danbooru:danbooru danbooru"
- Create a public/data/sample directory.
- Compile the resizer at lib/danbooru_image_resizer: "ruby extconf.rb && make". Do not run "make install". If this fails you will need to figure out your gd2/libjpeg/libpng dependencies.
- Create new database.yml and local_config.rb files in the config directory. Example files are provided.
- Create the database: "createdb danbooru"
- Load the schema: "psql danbooru < db/postgres.sql"
- Run the migrations: "RAILS_ENV=production rake db:migrate"
- Start the job daemon: "RAILS_ENV=production app/daemons/job_task_processor_ctl.rb start"
- You now need a way of managing the Rails process. The preferred method is using the Phusion Passenger module (see section below). Alternatively you can use Mongrel or fastcgi; there are several examples on the web.
- You should now be able to connect to your Danbooru instance. The first account you create will automatically become the administrator, so you should do this first.

=== PostgreSQL and test_parser

Starting with version 1.16, Danbooru relies on PostgreSQL's full text search feature to speed up tag queries. The gains are especially noticeable on tags with large post counts and for multi-tag joins. Unfortunately in order to adapt it for Danbooru a custom parser is required.

The easiest way of doing this on Debian is installing the postgresql-contrib-8.4 package. You should do this prior to running the Danbooru database migrations.

=== Apache and Phusion Passenger

Phusion Passenger is essentially mod_rails, a compiled module for Apache that is similar in functionality to fastcgi. It is used instead of fastcgi or Mongrel to proxy requests between Rails processes that Passenger manages. When used in conjunction with Ruby Enterprise Edition you can see improved performance and memory efficiency. Passenger also makes deployments much easier, requiring that you only touch a file called "restart.txt" in your tmp directory.

Installing Passenger on Debian is relatively painless; you can follow the instructions here: http://www.modrails.com/install.html. Passenger will automatically detect Rails folders so the Apache configuration for your site will be basic; the Passenger website explains in detail.

=== Ruby Enterprise Edition

REE is a special version of the Ruby interpreter that, among other things, uses a more intelligent malloc routine and performs copy-on-write garbage collection. The end result is better memory usage, up to 30% in ideal cases.

It is fairly straightforward to install and won't override your existing Ruby installation. Find out more here: http://www.rubyenterpriseedition.com

=== Troubleshooting

These instructions won't work for everyone. If your setup is not working, here are the steps I usually recommend to people:

1) Test the database. Make sure you can connect to it using psql. Make sure the tables exist. If this fails, you need to work on correctly installing PostgreSQL, importing the initial schema, and running the migrations.

2) Test the Rails database connection by using ruby script/console. Run Post.count to make sure Rails can connect to the database. If this fails, you need to make sure your Danbooru configuration files are correct.

3) If you're using Mongrel, test connecting directly to the Mongrel process by running elinks http://localhost:PORT. If this fails, you need to debug your Mongrel configuration file.

4) Test Apache to make sure it's proxying requests correctly. If this fails, you need to debug your Apache configuration file.
lib/cache.rb
@@ -20,6 +20,24 @@ module Cache
    ActiveRecord::Base.logger.debug('MemCache Incr %s' % [key])
  end

  def get_multi(keys, prefix, expiry = 0)
    key_to_sanitized_key_hash = keys.inject({}) do |hash, x|
      hash[x] = "#{prefix}:#{Cache.sanitize(x)}"
      hash
    end
    sanitized_key_to_value_hash = MEMCACHE.get_multi(key_to_sanitized_key_hash.values)
    returning({}) do |result_hash|
      key_to_sanitized_key_hash.each do |key, sanitized_key|
        if sanitized_key_to_value_hash.has_key?(sanitized_key)
          result_hash[key] = sanitized_key_to_value_hash[sanitized_key]
        else
          result_hash[key] = yield(key)
          Cache.put(sanitized_key, result_hash[key], expiry)
        end
      end
    end
  end

  def get(key, expiry = 0)
    if block_given?
      return yield
@@ -81,6 +99,7 @@ module Cache
  end

  module_function :get
  module_function :get_multi
  module_function :expire
  module_function :incr
  module_function :put
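The hit/miss merge in get_multi can be exercised with a plain Hash standing in for MEMCACHE. This simplified sketch (fake store, no sanitize step, both hypothetical simplifications) shows cache hits returned directly and misses computed by the block and written back:

```ruby
# Simplified model of Cache.get_multi: "store" plays the role of the
# memcached client, keys are namespaced with a prefix, and any key missing
# from the store is computed by the block and then backfilled.
def get_multi_sketch(store, keys, prefix)
  result = {}
  keys.each do |key|
    sanitized_key = "#{prefix}:#{key}"
    if store.has_key?(sanitized_key)
      result[key] = store[sanitized_key]   # cache hit: use stored value
    else
      result[key] = yield(key)             # cache miss: compute the value...
      store[sanitized_key] = result[key]   # ...and write it back to the cache
    end
  end
  result
end
```

The payoff of the real method is the single MEMCACHE.get_multi round trip for all keys at once; this sketch only models the merge logic, not the batching.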
test/unit/favorite_test.rb (new file)
@@ -0,0 +1,8 @@
require 'test_helper'

class FavoriteTest < ActiveSupport::TestCase
  # Replace this with your real tests.
  test "the truth" do
    assert true
  end
end
test/unit/pool_test.rb (new file)
@@ -0,0 +1,8 @@
require 'test_helper'

class PoolTest < ActiveSupport::TestCase
  # Replace this with your real tests.
  test "the truth" do
    assert true
  end
end
@@ -34,4 +34,133 @@ class PostTest < ActiveSupport::TestCase
      assert(!@post.is_deleted?, "Post should not be deleted.")
    end
  end

  context "A post version" do
    should "be created on any save" do
      @user = Factory.create(:user)
      @post = Factory.create(:post)
      assert_equal(1, @post.versions.size)

      @post.update_attributes(:rating => "e", :updater_id => @user.id, :updater_ip_addr => "125.0.0.0")
      assert_equal(2, @post.versions.size)
      assert_equal(@user.id, @post.versions.last.updater_id)
      assert_equal("125.0.0.0", @post.versions.last.updater_ip_addr)
    end
  end

  context "A post's tags" do
    setup do
      @post = Factory.create(:post)
    end

    should "have an array representation" do
      assert_equal(%w(tag1 tag2), @post.tag_array)
    end

    should "be counted" do
      @user = Factory.create(:user)
      @artist_tag = Factory.create(:artist_tag)
      @copyright_tag = Factory.create(:copyright_tag)
      @general_tag = Factory.create(:tag)
      @new_post = Factory.create(:post, :tag_string => "#{@artist_tag.name} #{@copyright_tag.name} #{@general_tag.name}")
      assert_equal(1, @new_post.tag_count_artist)
      assert_equal(1, @new_post.tag_count_copyright)
      assert_equal(1, @new_post.tag_count_general)
      assert_equal(0, @new_post.tag_count_character)
      assert_equal(3, @new_post.tag_count)

      @new_post.update_attributes(:tag_string => "babs", :updater_id => @user.id, :updater_ip_addr => "127.0.0.1")
      assert_equal(0, @new_post.tag_count_artist)
      assert_equal(0, @new_post.tag_count_copyright)
      assert_equal(1, @new_post.tag_count_general)
      assert_equal(0, @new_post.tag_count_character)
      assert_equal(1, @new_post.tag_count)
    end

    should "be merged with any changes that were made after loading the initial set of tags part 1" do
      @user = Factory.create(:user)
      @post = Factory.create(:post, :tag_string => "aaa bbb ccc")

      # user a adds <ddd>
      @post_edited_by_user_a = Post.find(@post.id)
      @post_edited_by_user_a.update_attributes(
        :updater_id => @user.id,
        :updater_ip_addr => "127.0.0.1",
        :old_tag_string => "aaa bbb ccc",
        :tag_string => "aaa bbb ccc ddd"
      )

      # user b removes <ccc> adds <eee>
      @post_edited_by_user_b = Post.find(@post.id)
      @post_edited_by_user_b.update_attributes(
        :updater_id => @user.id,
        :updater_ip_addr => "127.0.0.1",
        :old_tag_string => "aaa bbb ccc",
        :tag_string => "aaa bbb eee"
      )

      # final should be <aaa>, <bbb>, <ddd>, <eee>
      @final_post = Post.find(@post.id)
      assert_equal(%w(aaa bbb ddd eee), Tag.scan_tags(@final_post.tag_string).sort)
    end

    should "be merged with any changes that were made after loading the initial set of tags part 2" do
      # This is the same as part 1, only the order of operations is reversed.
      # The results should be the same.

      @user = Factory.create(:user)
      @post = Factory.create(:post, :tag_string => "aaa bbb ccc")

      # user a removes <ccc> adds <eee>
      @post_edited_by_user_a = Post.find(@post.id)
      @post_edited_by_user_a.update_attributes(
        :updater_id => @user.id,
        :updater_ip_addr => "127.0.0.1",
        :old_tag_string => "aaa bbb ccc",
        :tag_string => "aaa bbb eee"
      )

      # user b adds <ddd>
      @post_edited_by_user_b = Post.find(@post.id)
      @post_edited_by_user_b.update_attributes(
        :updater_id => @user.id,
        :updater_ip_addr => "127.0.0.1",
        :old_tag_string => "aaa bbb ccc",
        :tag_string => "aaa bbb ccc ddd"
      )

      # final should be <aaa>, <bbb>, <ddd>, <eee>
      @final_post = Post.find(@post.id)
      assert_equal(%w(aaa bbb ddd eee), Tag.scan_tags(@final_post.tag_string).sort)
    end
  end

  context "Adding a meta-tag" do
    setup do
      @post = Factory.create(:post)
    end

    should "be ignored" do
      @user = Factory.create(:user)

      @post.update_attributes(
        :updater_id => @user.id,
        :updater_ip_addr => "127.0.0.1",
        :tag_string => "aaa pool:1234 pool:test rating:s fav:bob"
      )
      assert_equal("aaa", @post.tag_string)
    end
  end

  context "Favoriting a post" do
    should "update the favorite string" do
      @user = Factory.create(:user)
      @post = Factory.create(:post)
      @post.add_favorite(@user.id)
      assert_equal("fav:#{@user.id}", @post.fav_string)

      @post.remove_favorite(@user.id)
      assert_equal("", @post.fav_string)
    end
  end
end
test/unit/tag_alias_test.rb (new file)
@@ -0,0 +1,8 @@
require 'test_helper'

class TagAliasTest < ActiveSupport::TestCase
  # Replace this with your real tests.
  test "the truth" do
    assert true
  end
end
test/unit/tag_implication_test.rb (new file)
@@ -0,0 +1,8 @@
require 'test_helper'

class TagImplicationTest < ActiveSupport::TestCase
  # Replace this with your real tests.
  test "the truth" do
    assert true
  end
end
@@ -108,15 +108,15 @@ class TagTest < ActiveSupport::TestCase
      tag1 = Factory.create(:tag, :name => "abc")
      tag2 = Factory.create(:tag, :name => "acb")

-     assert_equal({md5: "abc"}, Tag.parse_query("md5:abc"))
-     assert_equal({:post_id => [:between, 1, 2]}, Tag.parse_query("id:1..2"))
-     assert_equal({:post_id => [:gte, 1]}, Tag.parse_query("id:1.."))
-     assert_equal({:post_id => [:lte, 2]}, Tag.parse_query("id:..2"))
-     assert_equal({:post_id => [:gt, 2]}, Tag.parse_query("id:>2"))
-     assert_equal({:post_id => [:lt, 3]}, Tag.parse_query("id:<3"))
+     assert_equal(["abc"], Tag.parse_query("md5:abc")[:md5])
+     assert_equal([:between, 1, 2], Tag.parse_query("id:1..2")[:post_id])
+     assert_equal([:gte, 1], Tag.parse_query("id:1..")[:post_id])
+     assert_equal([:lte, 2], Tag.parse_query("id:..2")[:post_id])
+     assert_equal([:gt, 2], Tag.parse_query("id:>2")[:post_id])
+     assert_equal([:lt, 3], Tag.parse_query("id:<3")[:post_id])

      Tag.expects(:normalize_tags_in_query).returns(nil)
-     assert_equal({:include => ["acb"]}, Tag.parse_query("a*b"))
+     assert_equal(["acb"], Tag.parse_query("a*b")[:tags][:include])
    end
  end
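The id: assertions above exercise a small range grammar. A hypothetical stand-alone re-implementation (not Danbooru's actual parser) shows how each form could map onto the tested tuples:

```ruby
# Hypothetical parser for the id: range syntax exercised by the tests above:
#   "1..2" -> [:between, 1, 2]   "1.." -> [:gte, 1]   "..2" -> [:lte, 2]
#   ">2"   -> [:gt, 2]           "<3"  -> [:lt, 3]
def parse_id_range(expr)
  case expr
  when /\A(\d+)\.\.(\d+)\z/ then [:between, $1.to_i, $2.to_i]  # closed range
  when /\A(\d+)\.\.\z/      then [:gte, $1.to_i]               # open above
  when /\A\.\.(\d+)\z/      then [:lte, $1.to_i]               # open below
  when /\A>(\d+)\z/         then [:gt, $1.to_i]
  when /\A<(\d+)\z/         then [:lt, $1.to_i]
  end
end
```

Note the closed-range pattern must be tried before the open-ended ones, since "1.." would otherwise never reach its own branch.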