Chef Solo Jetty Cookbook Attributes - jetty

I'm having an issue where my chef.json attributes in my Vagrantfile seem to be getting ignored/overwritten.
Environment: Mac OS X 10.8 host, Ubuntu 12.04 guest virtualized in VirtualBox 4.18. Using Berkshelf for cookbook dependencies and the Opscode cookbooks for all of the recipes.
The box is spinning up fine, but I'm trying to configure Jetty to look more like it would if I had downloaded Jetty and un-tarred the archive, rather than the bunch of symlinks from /usr/share/jetty to all over the filesystem that it seems to default to.
Here's the chef portion of my Vagrantfile:
config.vm.provision :chef_solo do |chef|
  chef.json = {
    :java => {
      :install_flavor => "oracle",
      :jdk_version => '7',
      :oracle => {
        :accept_oracle_license_terms => true
      }
    },
    :jetty => {
      :port => '8080',
      :home => '/opt/jetty',
      :config_dir => '/opt/jetty/conf',
      :log_dir => '/opt/jetty/log',
      :context_dir => '/opt/jetty/context',
      :webapp_dir => '/opt/jetty/webapp'
    }
  }
  chef.add_recipe "apt"
  chef.add_recipe "mongodb::default"
  chef.add_recipe "java"
  chef.add_recipe "jetty"
end
Chef does seem to be reading the chef.json, because I can change Jetty's port from the Vagrantfile.
I've also tried changing these attributes in attributes/default.rb of the Jetty cookbook, but that didn't help either.
What am I missing?

If you take a look at the block below from jetty/recipes/default.rb:
jetty_pkgs = value_for_platform(
  ["debian","ubuntu"] => {
    "default" => ["jetty","libjetty-extra"]
  },
  ["centos","redhat","fedora"] => {
    "default" => ["jetty6","jetty6-jsp-2.1","jetty6-management"]
  },
  "default" => ["jetty"]
)
jetty_pkgs.each do |pkg|
  package pkg do
    action :install
  end
end
For Debian/Ubuntu, the default recipe installs the DEB packages from the official repository instead of doing what you want (downloading the binary from the official website and untarring it into your preferred location).
Because the DEB packages have their own layout (run dpkg -L jetty to see their file/directory structure), I reckon that's why your attribute overrides in chef.json did not work.
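To be clear, the chef.json overrides themselves do reach the node: node JSON is deep-merged over the cookbook's attributes/default.rb, roughly as in this simplified Ruby sketch (an illustration only, not Chef's actual merge implementation). The catch is that a merged value like jetty[:home] only has an effect if the recipe actually uses it, and the DEB package hard-codes its own paths.

```ruby
# Simplified illustration of how node JSON (chef.json) deep-merges
# over cookbook default attributes. Not Chef's real implementation.
def deep_merge(defaults, overrides)
  defaults.merge(overrides) do |_key, old_val, new_val|
    if old_val.is_a?(Hash) && new_val.is_a?(Hash)
      deep_merge(old_val, new_val)   # recurse into nested attribute hashes
    else
      new_val                        # node JSON wins for scalar values
    end
  end
end

cookbook_defaults = { jetty: { port: '8080', home: '/usr/share/jetty' } }
chef_json         = { jetty: { home: '/opt/jetty' } }

merged = deep_merge(cookbook_defaults, chef_json)
# :home is overridden, :port is kept from the cookbook defaults
```

So the merge happens either way; whether anything on disk changes depends entirely on whether the recipe consults those attributes.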
You can enable debugging output to see more information when you run the provisioning again:
VAGRANT_LOG=debug vagrant up
NOTE: You're probably better off writing your own cookbook to download the binary, untar it, set permissions, and do the other setup if you want Jetty installed the way you like ;-)
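A minimal custom recipe along those lines might look like the sketch below. This is untested and the version, URL, and paths are placeholders you would adjust; it just shows the download-and-untar pattern using standard Chef resources.

```ruby
# Hypothetical recipe: fetch a Jetty binary distribution and untar it
# into /opt instead of installing the distro's DEB packages.
jetty_version = node['jetty']['version'] # e.g. '9.0.4.v20130625' (placeholder)
jetty_home    = node['jetty']['home']    # e.g. '/opt/jetty'
tarball       = "#{Chef::Config[:file_cache_path]}/jetty-#{jetty_version}.tar.gz"

remote_file tarball do
  source "http://some.mirror/jetty-#{jetty_version}.tar.gz" # placeholder URL
  mode 0644
end

bash 'untar_jetty' do
  cwd '/opt'
  code <<-EOH
    tar xzf #{tarball}
    ln -sfn /opt/jetty-#{jetty_version} #{jetty_home}
    chown -R jetty:jetty /opt/jetty-#{jetty_version}
  EOH
  not_if { ::File.directory?("/opt/jetty-#{jetty_version}") }
end
```

With this layout your chef.json attributes (home, log_dir, etc.) are the single source of truth, because your own recipe is what consumes them.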


CloudSQL JDBC Logstash implementation

Question
I need to query CloudSQL from Logstash but can't find any example out there.
Additional Context
I ran the build command for the Postgres JDBC socket factory:
mvn -P jar-with-dependencies clean package -DskipTests
And provided the resulting jar as the Logstash JDBC driver (I tried the with-dependencies jar too):
input {
  jdbc {
    jdbc_driver_library => "/Users/gustavollermalylarrain/Documents/proyectos/labs/cloud-sql-jdbc-socket-factory/jdbc/postgres/target/postgres-socket-factory-1.6.4-SNAPSHOT.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql:///people?cloudSqlInstance=cosmic-keep-148903:us-central1:llermaly&socketFactory=com.google.cloud.sql.postgres.SocketFactory&user=postgres&password=postgres"
    statement => "SELECT * FROM people;"
    jdbc_user => "postgres"
    jdbc_password => "postgres"
  }
}
output {
  stdout {
    codec => rubydebug {
    }
  }
}
I'm getting this error:
Error: java.lang.ClassNotFoundException: org.postgresql.Driver. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?
Am I missing something?
The steps to query Cloud SQL from Logstash are:
Build the jar driver:
Clone this repo: https://github.com/GoogleCloudPlatform/cloud-sql-jdbc-socket-factory and run mvn -P jar-with-dependencies clean package -DskipTests
Copy files
Copy the jar files from jdbc/postgres/target/ to logstash-core/lib/jars. Also download the Postgres JDBC driver and copy its jar file to logstash-core/lib/jars as well.
Configure Logstash
The configuration file does not include the jar's path, because Logstash will look in the default folder logstash-core/lib/jars, where you copied the jar files.
input {
  jdbc {
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql:///people?cloudSqlInstance=cosmic-keep-148903:us-central1:llermaly&socketFactory=com.google.cloud.sql.postgres.SocketFactory&user=postgres&password=postgres"
    statement => "SELECT * FROM people;"
    jdbc_user => "postgres"
    jdbc_password => "postgres"
  }
}
output {
  stdout {
    codec => rubydebug {
    }
  }
}
jdbc_user and jdbc_password are ignored; the ones you provide in the connection string are used instead. With the Postgres Cloud SQL connector you can use either Postgres users or IAM accounts.
Note: You need to run this from a Compute Engine instance so the GCP credentials are applied automatically, or create the environment variables manually.

Image has contents that are not what they are reported to be

I am getting the following error while using the Paperclip gem. I have tried uploading JPG and PNG files and neither works. It seems like I am getting a validation error. Any help would be awesome, thanks!
Image has contents that are not what they are reported to be
class Listing < ActiveRecord::Base
  has_attached_file :image, :styles => { :medium => "200x", :thumb => "100x100>" }, :default_url => "404.jpg"
  validates_attachment_content_type :image, :content_type => /\Aimage\/.*\Z/
end
If you are using Windows 7 in development mode, you need to install file.exe manually and set the path. Please follow the content in the link:
installing file.exe manually.
After installing, update your environment:
Open config/environments/development.rb
Add the following line: Paperclip.options[:command_path] = 'C:\Program Files (x86)\GnuWin32\bin'
Restart your Rails server
This worked for Windows 8:
1. Download file.exe.
2. Test that it is installed correctly by opening cmd and running convert logo: logo.miff, then imdisplay logo.miff. A custom logo image should pop up on your Windows screen.
From here you can start configuring everything in the Rails app:
Open config/environments/development.rb
Add the following line: Paperclip.options[:command_path] = 'C:\tools\GnuWin32\bin'
If your Rails server is currently running, stop it and then run rails s again. After that you should be ready to go; upload an image in your app.

Deploy jetty module on client with puppet

I want to try to install the following module on my clients using puppet.
https://forge.puppetlabs.com/maestrodev/jetty
So I install the module on my master using:
puppet module install maestrodev-jetty
This seems to have worked:
$ puppet module list
/home/puppetmaster/.puppet/modules
├── maestrodev-jetty (v1.1.2)
├── maestrodev-wget (v1.5.6)
└── puppetlabs-stdlib (v4.3.2)
What I want to do next is deploy jetty, set it up and deploy it on my clients.
I made the following manifest for this:
class { 'jetty':
  version => "9.0.4.v20130625",
  home    => "/opt",
  user    => "jetty",
  group   => "jetty",
}

exec { 'stanbol-war-download':
  command => "wget -O /opt/jetty/webapps/my.war http://some.url/my.war",
  path    => "/usr/bin/",
  creates => "/opt/jetty/webapps/my.war",
} ->
exec { 'jetty_start':
  command => "java -jar /opt/jetty/my.jar jetty.port=8181 -Xmx2048m -XX:MaxPermSize=512M",
  cwd     => "/opt/jetty",
  path    => "/usr/bin/",
  notify  => Service["jetty"],
  returns => [0, 254],
}
I have been trying for a while, but I can't seem to get it installed and running on my clients without some sort of error, syntax or otherwise.

How do I run my custom recipes on AWS OpsWorks?

I've created a GitHub repo for my simple custom recipe:
laravel/
|- recipes/
|  |- deploy.rb
|- templates/
   |- default/
      |- database.php.erb
I've added the repo to Custom Chef Recipes as https://github.com/minkruben/Laravel-opsworks.git
I've added laravel::deploy to the deploy "cycle".
This is my deploy.rb:
node[:deploy].each do |app_name, deploy|
  if deploy[:application] == "platform"
    script "set_permissions" do
      interpreter "bash"
      user "root"
      cwd "#{deploy[:deploy_to]}/current/app"
      code <<-EOH
        chmod -R 777 storage
      EOH
    end
    template "#{deploy[:deploy_to]}/current/app/config/database.php" do
      source "database.php.erb"
      mode 0660
      group deploy[:group]
      if platform?("ubuntu")
        owner "www-data"
      elsif platform?("amazon")
        owner "apache"
      end
      variables(
        :host => (deploy[:database][:host] rescue nil),
        :user => (deploy[:database][:username] rescue nil),
        :password => (deploy[:database][:password] rescue nil),
        :db => (deploy[:database][:database] rescue nil)
      )
      only_if do
        File.directory?("#{deploy[:deploy_to]}/current")
      end
    end
  end
end
When I log into the instance over SSH as the ubuntu user, the app/storage folder's permissions aren't changed and app/config/database.php is not populated with the database details.
Am I missing some critical step somewhere? There are no errors in the log.
The recipe is clearly recognized and loaded, but doesn't seem to be executed.
With OpsWorks, you have 2 options:
Use one of Amazon's built-in layers, in which case the deployment recipe is provided by Amazon and you can extend Amazon's logic with hooks: http://docs.aws.amazon.com/opsworks/latest/userguide/workingcookbook-extend-hooks.html
Use a custom layer, in which case you are responsible for providing all recipes including deployment: http://docs.aws.amazon.com/opsworks/latest/userguide/create-custom-deploy.html
The logic you have here looks more like a hook than a deployment recipe. Why? Because you are simply modifying an already-deployed app rather than specifying the deployment logic itself. This suggests you are using one of Amazon's built-in layers and that Amazon is providing the deployment recipe for you.
If the above assumption is correct, then you are on path #1. Re-implementing your logic as a hook should do the trick.
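If you go the hook route, the idea is to place a plain Chef recipe inside your app repository under a deploy/ directory, named for the deployment event it should run after. A rough sketch of moving the permission fix into such a hook follows; the file name follows the OpsWorks hook convention, and release_path is provided by the deploy context (check the linked docs for the exact variables available):

```ruby
# deploy/after_restart.rb in the application repository (sketch).
# OpsWorks runs this after the built-in deploy recipe restarts the app;
# release_path points at the release that was just deployed.
run "chmod -R 777 #{release_path}/app/storage"
```

The template logic from your recipe could move into a hook in the same way, which avoids reimplementing the whole deployment yourself.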

puppet exec vagrant plugin install not working

I have successfully installed vagrant-aws on a centos VM, and I am trying to 'puppetize' this task. My relevant puppet code is below:
exec { 'install_aws':
  command => '/usr/bin/vagrant plugin install vagrant-aws',
  #require => [Exec['install_dependent'], Package['vagrant']],
}
When I provision the machine, it says Exec[install_aws]/returns: executed successfully, but the plugin is not installed and I have to run the command manually for it to work. I've never seen this behaviour with Puppet before; can someone help?
exec { 'install_aws':
  command => '/usr/bin/sudo /usr/bin/vagrant plugin install vagrant-aws',
  require => [Exec['install_dependent'], Package['vagrant']],
}
Fixed the code above. Good point: the command needed to run as the superuser. Seems like a silly mistake; thanks for pointing it out ^^
Instead of using sudo to run that command (as you pointed out in your answer), I would add the user parameter to the exec and run it as root (or any other user with suitable permissions):
exec { 'install_aws':
  user    => 'root',
  command => '/usr/bin/vagrant plugin install vagrant-aws',
  require => [Exec['install_dependent'], Package['vagrant']],
}