I am trying to use SimpleSAMLphp on an AWS Ubuntu instance, but for some reason I can't make it run correctly. I am using an AWS load balancer for HTTPS; I do not know if it affects the configuration.
config.php
$config = array(
    'baseurlpath' => 'simplesaml/',
    'certdir' => 'cert/',
    'loggingdir' => 'log/',
    'datadir' => 'data/',
    'tempdir' => '/tmp/simplesaml',
    'technicalcontact_name' => 'David Pacheco',
    'technicalcontact_email' => 'dpacheco@lumstons.com',
    'timezone' => 'America/Mexico_City',
    'secretsalt' => '6ogT+0kPWJAO6FbKThWcI1spujbVVdmFEsPVRiPKEWw=',
    'auth.adminpassword' => 'david',
    'admin.protectindexpage' => false,
    'admin.protectmetadata' => false,
    'admin.checkforupdates' => true,
    'trusted.url.domains' => array(),
    'trusted.url.regex' => false,
    'enable.http_post' => false,
    'debug' => array(
        'saml' => false,
        'backtraces' => true,
        'validatexml' => false,
    ),
    'showerrors' => true,
    'errorreporting' => true,
    'logging.level' => SimpleSAML\Logger::NOTICE,
    'logging.handler' => 'syslog',
    'logging.facility' => defined('LOG_LOCAL5') ? constant('LOG_LOCAL5') : LOG_USER,
    'logging.processname' => 'simplesamlphp',
    'logging.logfile' => 'simplesamlphp.log',
    'statistics.out' => array(),
    'proxy' => null,
    'database.dsn' => 'mysql:host=localhost;dbname=saml',
    'database.username' => 'simplesamlphp',
    'database.password' => 'secret',
    'database.options' => array(),
    'database.prefix' => '',
    'database.persistent' => false,
    'database.slaves' => array(),
    'enable.saml20-idp' => false,
    'enable.shib13-idp' => false,
    'enable.adfs-idp' => false,
    'enable.wsfed-sp' => false,
    'enable.authmemcookie' => false,
    'default-wsfed-idp' => 'urn:federation:pingfederate:localhost',
    'shib13.signresponse' => true,
    'session.duration' => 8 * (60 * 60),
    'session.datastore.timeout' => (4 * 60 * 60),
    'session.state.timeout' => (60 * 60),
    'session.cookie.name' => 'SimpleSAMLSessionID',
    'session.cookie.lifetime' => 0,
    'session.cookie.path' => '/',
    'session.cookie.domain' => null,
    'session.cookie.secure' => false,
    'session.phpsession.cookiename' => 'SimpleSAML',
    'session.phpsession.savepath' => null,
    'session.phpsession.httponly' => true,
    'session.authtoken.cookiename' => 'SimpleSAMLAuthToken',
    'session.rememberme.enable' => false,
    'session.rememberme.checked' => false,
    'session.rememberme.lifetime' => (14 * 86400),
    'memcache_store.servers' => array(
        array(
            array('hostname' => 'localhost'),
        ),
    ),
    'memcache_store.prefix' => '',
    'memcache_store.expires' => 36 * (60 * 60),
    'language' => array(
        'priorities' => array(
            'no' => array('nb', 'nn', 'en', 'se'),
            'nb' => array('no', 'nn', 'en', 'se'),
            'nn' => array('no', 'nb', 'en', 'se'),
            'se' => array('nb', 'no', 'nn', 'en'),
        ),
    ),
    'language.available' => array(
        'en', 'no', 'nn', 'se', 'da', 'de', 'sv', 'fi', 'es', 'ca', 'fr', 'it', 'nl', 'lb',
        'cs', 'sl', 'lt', 'hr', 'hu', 'pl', 'pt', 'pt-br', 'tr', 'ja', 'zh', 'zh-tw', 'ru',
        'et', 'he', 'id', 'sr', 'lv', 'ro', 'eu', 'el', 'af',
    ),
    'language.rtl' => array('ar', 'dv', 'fa', 'ur', 'he'),
    'language.default' => 'en',
    'language.parameter.name' => 'language',
    'language.parameter.setcookie' => true,
    'language.cookie.name' => 'language',
    'language.cookie.domain' => null,
    'language.cookie.path' => '/',
    'language.cookie.secure' => false,
    'language.cookie.httponly' => false,
    'language.cookie.lifetime' => (60 * 60 * 24 * 900),
    'language.i18n.backend' => 'SimpleSAMLphp',
    'attributes.extradictionary' => null,
    'theme.use' => 'default',
    'template.auto_reload' => false,
    'production' => true,
    'idpdisco.enableremember' => true,
    'idpdisco.rememberchecked' => true,
    'idpdisco.validate' => true,
    'idpdisco.extDiscoveryStorage' => null,
    'idpdisco.layout' => 'dropdown',
    'authproc.idp' => array(
        30 => 'core:LanguageAdaptor',
        45 => array(
            'class' => 'core:StatisticsWithAttribute',
            'attributename' => 'realm',
            'type' => 'saml20-idp-SSO',
        ),
        50 => 'core:AttributeLimit',
        99 => 'core:LanguageAdaptor',
    ),
    'authproc.sp' => array(
        90 => 'core:LanguageAdaptor',
    ),
    'metadata.sources' => array(
        array('type' => 'flatfile'),
    ),
    'metadata.sign.enable' => false,
    'metadata.sign.privatekey' => null,
    'metadata.sign.privatekey_pass' => null,
    'metadata.sign.certificate' => null,
    'metadata.sign.algorithm' => null,
    'store.type' => 'phpsession',
    'store.sql.dsn' => 'sqlite:/path/to/sqlitedatabase.sq3',
    'store.sql.username' => null,
    'store.sql.password' => null,
    'store.sql.prefix' => 'SimpleSAMLphp',
    'store.redis.host' => 'localhost',
    'store.redis.port' => 6379,
    'store.redis.prefix' => 'SimpleSAMLphp',
);
Apache 2 site config:
<VirtualHost *:80>
    ServerName saml.veptec.mx
    DocumentRoot /var/www/html

    Alias /simplesaml /var/simplesamlphp/www
    <Directory /var/simplesamlphp/www>
        Require all granted
    </Directory>
</VirtualHost>
The URL https://saml.dominian.com/simplesaml is redirected to https://saml.dominian.com/simplesaml/module.php/core/frontpage_welcome.php, but that page returns HTTP ERROR 500. I tried to track down the error and found out that it is a problem with the config file.
Any ideas?
I found the answer: the localization was not initializing. It was a bug.
In the ./lib/SimpleSAML/Locale/Localization.php file I simply call this method:
$this->setupTranslator();
at the end of the constructor, and it works correctly.
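For reference, the change looks roughly like this (a minimal sketch: the constructor signature and the elided body are assumptions based on SimpleSAMLphp 1.x, not the actual file contents):

// lib/SimpleSAML/Locale/Localization.php (sketch, not the full class)
class Localization
{
    public function __construct(\SimpleSAML_Configuration $configuration)
    {
        // ... existing initialization of languages and translation domains ...

        // Workaround: initialize the translator explicitly at the end of
        // the constructor so the frontpage no longer returns HTTP 500.
        $this->setupTranslator();
    }
}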
Using this Hash object:
{"foo" => {"bar" => 1, "baz" => 2}, "bla" => [1,2,3]}
I want to produce this array of Hash objects:
[
{"foo" => "*", "bla" => [1,2,3]},
{"foo" => {"bar" => "*", "baz" => 2}, "bla" => [1,2,3]},
{"foo" => {"bar" => "1", "baz" => "*"}, "bla" => [1,2,3]},
{"foo" => {"bar" => "*", "baz" => 2}, "bla" => "*"},
]
Where I basically went over each key and changed its value to "*" while preserving the overall structure of the hash, and saved each new hash produced in some array.
I have tried many ideas, but most just won't work since I can't guess the Array type beforehand. I only know this hash is produced by JSON.parse and then changed into Hash(String, JSON::Any).
My current try at it:
hash = {"bar" => {"and" => "2", "br" => "1"}}
arr = [hash, {"bar" => "1"}]
arr.delete(arr.last)
arr.delete(hash)
def changer(hash, arr, original = nil)
original = hash.dup
hash.each do |k, v|
if v.is_a?(Hash)
changer(v, arr, hash)
elsif v.is_a?(Array)
v.each do |a|
if a.is_a?(Hash)
changer(a, arr, hash)
end
end
elsif v.is_a?(String) && original.is_a?(Hash(String, String))
original[k.to_s] = "*"
arr << original
end
end
end
Crystal v0.25.0 implements JSON::Any and YAML::Any without recursive aliases. With that change:
require "json"
hash = JSON.parse(%(
{"foo": {"bar": 1, "baz": 2}, "bla": [1,2,3]}
))
def changer(any : JSON::Any)
result = [JSON::Any.new("*")]
if (hash = any.as_h?)
hash.each do |key, value|
changer(value).each do |s|
result << JSON::Any.new(hash.merge({key => s}))
end
end
end
result
end
puts changer(hash).join("\n")
*
{"foo" => "*", "bla" => [1_i64, 2_i64, 3_i64]}
{"foo" => {"bar" => "*", "baz" => 2_i64}, "bla" => [1_i64, 2_i64, 3_i64]}
{"foo" => {"bar" => 1_i64, "baz" => "*"}, "bla" => [1_i64, 2_i64, 3_i64]}
{"foo" => {"bar" => 1_i64, "baz" => 2_i64}, "bla" => "*"}
I'm trying to create an initializer for a class that receives a Hash as a parameter. The Hash is a {String => Type} hash and can be nested. I'm getting an error when running this code:
# file: types.cr
class Types
  alias Type = Nil |
               Bool |
               Int32 |
               Int64 |
               Float64 |
               String |
               Array(Type) |
               Hash(String, Type)

  def initialize(@input : Type)
  end
end
input = {"a" => {"b" => {"c" => {"c1" => 1, "c2" => 2, "c3" => true}}}}
s = Types.new(input)
Here is the error I get when running the code above:
$ crystal types.cr
Error in types.cr:16: instantiating 'Types:Class#new(Hash(String, Hash(String, Hash(String, Hash(String, Bool | Int32)))))'
s = Types.new(input)
^~~
in types.cr:11: instance variable '@input' of Types must be Types::Type, not Hash(String, Hash(String, Hash(String, Hash(String, Bool | Int32))))
def initialize(@input : Type)
^~~~~~
Is this possible with Crystal? How should I approach this?
Thanks!
You can do this by specifying the type of each hash literal; without the of String => Types::Type annotation, Crystal infers the narrowest type for each literal (e.g. Hash(String, Int32 | Bool) for the innermost one), which is not Types::Type:
c = {"c1" => 1, "c2" => 2, "c3" => true} of String => Types::Type
b = {"c" => c} of String => Types::Type
a = {"b" => b} of String => Types::Type
t = Types.new({"a" => a} of String => Types::Type)
pp t # => #<Types:0x103085ec0
# #input=
# {"a" => {"b" => {"c" => {"c1" => 1, "c2" => 2, "c3" => true}}}}>
Another approach is to define and use a Hash-like type:
alias Type = Nil |
             Bool |
             Int32 |
             Int64 |
             Float64 |
             String |
             Array(Type) |
             Hash(String, Type)

alias TypesHash = Hash(String, Type)

t = TypesHash{
  "a" => TypesHash{
    "b" => TypesHash{
      "c" => TypesHash{
        "c1" => 1, "c2" => 2, "c3" => true,
      },
    },
  },
}
t # {"a" => {"b" => {"c" => {"c1" => 1, "c2" => 2, "c3" => true}}}}
t["a"] # {"b" => {"c" => {"c1" => 1, "c2" => 2, "c3" => true}}}
t["a"].as(TypesHash)["b"] # {"c" => {"c1" => 1, "c2" => 2, "c3" => true}}
t["a"].as(TypesHash)["b"].as(TypesHash)["c"] # {"c1" => 1, "c2" => 2, "c3" => true}
So you can pass it to the constructor like any TypesHash object:
class Types
  def initialize(@input : TypesHash); end
end
Types.new t
For example:
apolloServer(request => ({
  schema: typeDefinitionArray,
  graphiql: true,
  context: request.session
}))
http://docs.apollostack.com/apollo-server/tools.html
I know => defines an ES6 arrow function with a bound this, but what do the parentheses after the => do?
If you skipped the parentheses, it would be ambiguous whether the braces delimit the body of the arrow function or an object literal you want to return. So it's either
apolloServer(request => {
  return {
    schema: typeDefinitionArray,
    graphiql: true,
    context: request.session
  }
})
or
apolloServer(request => ({
  schema: typeDefinitionArray,
  graphiql: true,
  context: request.session
}))
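To see why the parentheses matter, compare these two functions (a minimal standalone sketch; the property name "answer" is just for illustration):

// Without parentheses the braces are parsed as the function body:
// "answer:" becomes a label, 42 is an expression statement, and the
// function returns undefined.
const broken = () => { answer: 42 };
console.log(broken()); // undefined

// Wrapping the literal in parentheses forces expression context,
// so the object is actually returned.
const fixed = () => ({ answer: 42 });
console.log(fixed()); // { answer: 42 }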
Hello, I have a problem with my Logstash multiline configuration. I'm parsing WebSphere/Java logs, and multiline doesn't work on some kinds of log entries.
My multiline configuration looks like this. I tried several types of regex, but none of them worked.
codec => multiline {
  pattern => "^\A%{SYSLOG5424SD}"
  negate => true
  what => previous
}
This is an example of a log entry that is not parsed the right way:
[1.6.2015 15:02:46:635 CEST] 00000109 BusinessExcep E CNTR0020E: EJB threw an unexpected (non-declared) exception during invocation of method "processCommand" on bean "BeanId(Issz_Produkcia_2.1.63#Ssz_Server_EJB.jar#CommandDispatcherImpl, null)". Exception data: javax.ejb.EJBTransactionRolledbackException: Transaction rolled back; nested exception is: javax.ejb.EJBTransactionRolledbackException: Transaction rolled back; nested exception is: javax.transaction.TransactionRolledbackException: Transaction is ended due to timeout
javax.ejb.EJBTransactionRolledbackException: Transaction rolled back; nested exception is: javax.transaction.TransactionRolledbackException: Transaction is ended due to timeout
javax.transaction.TransactionRolledbackException: Transaction is ended due to timeout
at com.ibm.tx.jta.impl.EmbeddableTranManagerImpl.completeTxTimeout(EmbeddableTranManagerImpl.java:62)
at com.ibm.tx.jta.impl.EmbeddableTranManagerSet.completeTxTimeout(EmbeddableTranManagerSet.java:85)
at com.ibm.ejs.csi.TransactionControlImpl.completeTxTimeout(TransactionControlImpl.java:1347)
at com.ibm.ejs.csi.TranStrategy.postInvoke(TranStrategy.java:273)
at com.ibm.ejs.csi.TransactionControlImpl.postInvoke(TransactionControlImpl.java:579)
at com.ibm.ejs.container.EJSContainer.postInvoke(EJSContainer.java:4874)
at sk.sits.upsvar.server.ejb.entitymanagers.EJSLocal0SLDokumentManagerImpl_18dd4eb4.findAllDokumentPripadByCriteriaMap(EJSLocal0SLDokumentManagerImpl_18dd4eb4.java)
at sk.sits.upsvar.server.ejb.DataAccessServiceImpl.executeDokumentCmd(DataAccessServiceImpl.java:621)
at sk.sits.upsvar.server.ejb.DataAccessServiceImpl.executeCmd(DataAccessServiceImpl.java:220)
at sk.sits.upsvar.server.ejb.EJSLocal0SLDataAccessServiceImpl_6e5b0656.executeCmd(EJSLocal0SLDataAccessServiceImpl_6e5b0656.java)
at sk.sits.upsvar.server.ejb.CommandDispatcherImpl.processSoloCommand(CommandDispatcherImpl.java:222)
at sk.sits.upsvar.server.ejb.CommandDispatcherImpl._processCommand(CommandDispatcherImpl.java:151)
at sk.sits.upsvar.server.ejb.CommandDispatcherImpl.processCommand(CommandDispatcherImpl.java:100)
at sk.sits.upsvar.server.ejb.EJSLocal0SLCommandDispatcherImpl_b974dd5c.processCommand(EJSLocal0SLCommandDispatcherImpl_b974dd5c.java)
at sk.sits.upsvar.server.ejb.SszServiceImpl.process(SszServiceImpl.java:146)
at sk.sits.upsvar.server.ejb.EJSRemote0SLSszService_8e2ee81c.process(EJSRemote0SLSszService_8e2ee81c.java)
at sk.sits.upsvar.server.ejb._EJSRemote0SLSszService_8e2ee81c_Tie.process(_EJSRemote0SLSszService_8e2ee81c_Tie.java)
at sk.sits.upsvar.server.ejb._EJSRemote0SLSszService_8e2ee81c_Tie._invoke(_EJSRemote0SLSszService_8e2ee81c_Tie.java)
at com.ibm.CORBA.iiop.ServerDelegate.dispatchInvokeHandler(ServerDelegate.java:678)
at com.ibm.CORBA.iiop.ServerDelegate.dispatch(ServerDelegate.java:525)
at com.ibm.rmi.iiop.ORB.process(ORB.java:576)
at com.ibm.CORBA.iiop.ORB.process(ORB.java:1578)
at com.ibm.rmi.iiop.Connection.doRequestWork(Connection.java:3076)
at com.ibm.rmi.iiop.Connection.doWork(Connection.java:2946)
at com.ibm.rmi.iiop.WorkUnitImpl.doWork(WorkUnitImpl.java:64)
at com.ibm.ejs.oa.pool.PooledThread.run(ThreadPool.java:118)
at com.ibm.ws.util.ThreadPool$Worker.run(ThreadPool.java:1700)
javax.ejb.EJBTransactionRolledbackException: Transaction rolled back; nested exception is: javax.transaction.TransactionRolledbackException: Transaction is ended due to timeout
Caused by: javax.transaction.TransactionRolledbackException: Transaction is ended due to timeout
at com.ibm.tx.jta.impl.EmbeddableTranManagerImpl.completeTxTimeout(EmbeddableTranManagerImpl.java:62)
at com.ibm.tx.jta.impl.EmbeddableTranManagerSet.completeTxTimeout(EmbeddableTranManagerSet.java:85)
at com.ibm.ejs.csi.TransactionControlImpl.completeTxTimeout(TransactionControlImpl.java:1347)
at com.ibm.ejs.csi.TranStrategy.postInvoke(TranStrategy.java:273)
at com.ibm.ejs.csi.TransactionControlImpl.postInvoke(TransactionControlImpl.java:579)
at com.ibm.ejs.container.EJSContainer.postInvoke(EJSContainer.java:4874)
at sk.sits.upsvar.server.ejb.entitymanagers.EJSLocal0SLDokumentManagerImpl_18dd4eb4.findAllDokumentPripadByCriteriaMap(EJSLocal0SLDokumentManagerImpl_18dd4eb4.java)
at sk.sits.upsvar.server.ejb.DataAccessServiceImpl.executeDokumentCmd(DataAccessServiceImpl.java:621)
at sk.sits.upsvar.server.ejb.DataAccessServiceImpl.executeCmd(DataAccessServiceImpl.java:220)
at sk.sits.upsvar.server.ejb.EJSLocal0SLDataAccessServiceImpl_6e5b0656.executeCmd(EJSLocal0SLDataAccessServiceImpl_6e5b0656.java)
at sk.sits.upsvar.server.ejb.CommandDispatcherImpl.processSoloCommand(CommandDispatcherImpl.java:222)
at sk.sits.upsvar.server.ejb.CommandDispatcherImpl._processCommand(CommandDispatcherImpl.java:151)
at sk.sits.upsvar.server.ejb.CommandDispatcherImpl.processCommand(CommandDispatcherImpl.java:100)
at sk.sits.upsvar.server.ejb.EJSLocal0SLCommandDispatcherImpl_b974dd5c.processCommand(EJSLocal0SLCommandDispatcherImpl_b974dd5c.java)
at sk.sits.upsvar.server.ejb.SszServiceImpl.process(SszServiceImpl.java:146)
at sk.sits.upsvar.server.ejb.EJSRemote0SLSszService_8e2ee81c.process(EJSRemote0SLSszService_8e2ee81c.java)
at sk.sits.upsvar.server.ejb._EJSRemote0SLSszService_8e2ee81c_Tie.process(_EJSRemote0SLSszService_8e2ee81c_Tie.java)
at sk.sits.upsvar.server.ejb._EJSRemote0SLSszService_8e2ee81c_Tie._invoke(_EJSRemote0SLSszService_8e2ee81c_Tie.java)
at com.ibm.CORBA.iiop.ServerDelegate.dispatchInvokeHandler(ServerDelegate.java:678)
at com.ibm.CORBA.iiop.ServerDelegate.dispatch(ServerDelegate.java:525)
at com.ibm.rmi.iiop.ORB.process(ORB.java:576)
at com.ibm.CORBA.iiop.ORB.process(ORB.java:1578)
at com.ibm.rmi.iiop.Connection.doRequestWork(Connection.java:3076)
at com.ibm.rmi.iiop.Connection.doWork(Connection.java:2946)
at com.ibm.rmi.iiop.WorkUnitImpl.doWork(WorkUnitImpl.java:64)
at com.ibm.ejs.oa.pool.PooledThread.run(ThreadPool.java:118)
at com.ibm.ws.util.ThreadPool$Worker.run(ThreadPool.java:1700)
It's parsed line by line, and I need it parsed together as one event. I don't know if there is some character dividing the lines.
I tried these patterns:
pattern => "%{DATESTAMP} %{WORD:zone}]"
pattern => "^\["
pattern => "\A"
And a lot more that I don't remember. Can someone who has faced this problem help me?
Thank you a lot.
Here is my full configuration.
input {
  file {
    path => "D:\Log\Logstash\testlog.log"
    type => "LOG"
    start_position => "beginning"
    codec => plain {
      charset => "ISO-8859-1"
    }
    codec => multiline {
      pattern => "^\A%{SYSLOG5424SD}"
      negate => true
      what => previous
    }
  }
}

filter {
  grok {
    match => [ "message", ".*exception.*" ]
    add_tag => "exception"
  }
  mutate {
    remove_tag => "_grokparsefailure"
  }
  grok {
    match => [ "message", "%{DATESTAMP} %{WORD:}] %{WORD:} %{WORD:}\s* W" ]
    add_tag => "Warning"
    remove_tag => "_grokparsefailure"
  }
  grok {
    match => [ "message", "%{DATESTAMP} %{WORD:}] %{WORD:} %{WORD:}\s* F" ]
    add_tag => "Fatal"
    remove_tag => "_grokparsefailure"
  }
  grok {
    match => [ "message", "%{DATESTAMP} %{WORD:}] %{WORD:} %{WORD:}\s* O" ]
    add_tag => "Message"
    remove_tag => "_grokparsefailure"
  }
  grok {
    match => [ "message", "%{DATESTAMP} %{WORD:}] %{WORD:} %{WORD:}\s* C" ]
    add_tag => "Config"
    remove_tag => "_grokparsefailure"
  }
  #if ("Warning" not in [tags]) {
  grok {
    match => [ "message", "%{DATESTAMP} %{WORD:}] %{WORD:} %{WORD:}\s* E" ]
    add_tag => "Error"
    remove_tag => "_grokparsefailure"
  }
  #} else {
  grok {
    match => [ "message", "%{DATESTAMP} %{WORD:}] %{WORD:} %{WORD: }\s* I" ]
    add_tag => "Info"
  }
  #}
  grok {
    match => [ "message", "%{DATESTAMP} %{WORD:zone}] %{WORD:ID} %{WORD:CLASS}\s* . (.*\s){0,}%{GREEDYDATA:OBSAH}" ]
    remove_tag => "_grokparsefailure"
  }
  grok {
    match => [ "message", "%{DATESTAMP} %{WORD:zone}] %{WORD:ID} %{WORD:CLASS}\s* . (.*\s){0,}%{WORD:WAS_CODE}:%{GREEDYDATA:OBSAH}" ]
    #"message","%{DATESTAMP} %{WORD:zone}] %{WORD:ID} %{WORD:CLASS}\s* W \s*\[SID:%{WORD:ISSZSID}]%{GREEDYDATA:OBSAH}"]
    remove_tag => "_grokparsefailure"
    add_tag => "was_error"
  }
  if ("was_error" not in [tags]) {
    grok {
      match => [ "message", "%{DATESTAMP} %{WORD:zone}] %{WORD:ID} %{WORD:CLASS}\s* . \s*\[SID:%{WORD:ISSZSID}]%{GREEDYDATA:OBSAH}" ]
      remove_tag => "_grokparsefailure"
    }
    if "_grokparsefailure" not in [tags] {
      if [ISSZSID] != "null" {
        mutate {
          add_tag => "ISSZwithID"
          remove_tag => "_grokparsefailure"
        }
      } else {
        mutate {
          add_tag => "ISSZnull"
          remove_tag => "_grokparsefailure"
        }
      }
    }
  }
}

output {
  if "_grokparsefailure" not in [tags] {
    elasticsearch {
      hosts => ["127.0.0.1:9200"]
      #protocol => "http"
    }
  }
  stdout {}
}
As assumed, using multiline as a codec alongside another codec is not what it was intended for: an input takes a single codec, so declaring both plain and multiline on the same file input does not work. I'd rather use multiline as the single codec, or as a filter.
Transform your configuration into this and you will get the results you are looking for:
input {
  file {
    path => "D:\Log\Logstash\testlog.log"
    type => "LOG"
    start_position => "beginning"
    codec => plain { charset => "ISO-8859-1" }
  }
}

filter {
  multiline {
    pattern => "^\A%{SYSLOG5424SD}"
    negate => true
    what => previous
  }
  # ... all other filters
}

output {
  # your output definitions
}
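Alternatively, if you want to keep the multiline handling on the input itself, the multiline codec accepts a charset option of its own, so the plain codec can be dropped entirely (a sketch of the input block only, reusing the pattern from the question):

input {
  file {
    path => "D:\Log\Logstash\testlog.log"
    type => "LOG"
    start_position => "beginning"
    codec => multiline {
      charset => "ISO-8859-1"
      pattern => "^\A%{SYSLOG5424SD}"
      negate => true
      what => previous
    }
  }
}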
A famous multiline parsing example is the one from Jordan Sissel on MySQL log parsing: https://gist.github.com/jordansissel/3753353
Cheers