Is there a way to unit test localization resources in an asp.net core class library? - unit-testing

I want to have a resource class library and test it in another MSTest class library.
This is my project.json in the resource library (my resx file is in the Resources folder):
{
  "version": "1.0.0-*",
  "dependencies": {
    "NETStandard.Library": "1.6.0",
    "Microsoft.AspNetCore.Mvc.DataAnnotations": "1.0.1"
  },
  "buildOptions": {
    "embed": {
      "include": [ "Resources/PS.ModelBase.Models.Person.fa.resx" ]
    }
  },
  "frameworks": {
    "netstandard1.6": {
      "imports": "dnxcore50"
    }
  }
}
And this is my test method:
using Microsoft.VisualStudio.TestTools.UnitTesting;
using System;
using System.Collections;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Resources;
using System.Threading.Tasks;

namespace PS.ModelBaseTest
{
    [TestClass]
    public class PersonTest
    {
        [TestMethod]
        public void FullNameValidateNullFirstName()
        {
            ResourceManager rm = new ResourceManager(typeof(PS.ModelBase.Models.Person));
            string test = rm.GetString("Fisrt Name", new System.Globalization.CultureInfo("fa-IR"));
            Assert.AreEqual("Fisrt Name", test);
        }
    }
}
I get this error when I run the test:
Could not find any resources appropriate for the specified culture or
the neutral culture. Make sure "PS.ModelBase.Models.Person.resources"
was correctly embedded or linked into assembly "PS.ModelBase" at
compile time, or that all the satellite assemblies required are
loadable and fully signed.
What should I do?
Update:
OK, I solved my problem by moving the resx files into the Models folder (where the Person class is) and renaming them to match the class name (like Person.en.resx).
Nothing special is needed in project.json:
{
  "dependencies": {
    "NETStandard.Library": "1.6.0",
    "Microsoft.AspNetCore.Mvc.DataAnnotations": "1.0.1"
  },
  "frameworks": {
    "netstandard1.6": {
      "imports": "dnxcore50"
    }
  },
  "version": "1.0.0-*"
}
But I faced another problem: I want to keep my resx files in the Resources folder, not in the Models folder! How can I embed the resx file for the Person class when it sits in the Resources folder?
I figured out the second problem. In fact, the resource file and the class must be in the same namespace.
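To see why, here is a minimal sketch using the types from the question: ResourceManager builds its lookup base name from the class's namespace and name, so the embedded resource has to carry that same full name, whatever folder the .resx file lives in.
using System.Reflection;
using System.Resources;

// ResourceManager derives its base name from the type's namespace + class name,
// so this call probes for "PS.ModelBase.Models.Person.resources" (and a "fa"
// satellite for the fa culture), regardless of which folder the .resx sat in.
var rm = new ResourceManager(typeof(PS.ModelBase.Models.Person));

// Equivalent explicit form; the base name must match the embedded resource name.
var rm2 = new ResourceManager(
    "PS.ModelBase.Models.Person",
    typeof(PS.ModelBase.Models.Person).GetTypeInfo().Assembly);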

Using the following two resources as references:
Github Issue: Updates to project.json schema
project.json reference
Based on the project file in the question, either try
"includeFiles" string/string[]
A list of file paths to include. The paths are rooted at the project folder. This list has a higher
priority than the include and exclude globbing patterns, hence a file
listed here and in the exclude globbing pattern will still be
included. Defaults to none.
"buildOptions": {
"embed": {
"includeFiles": [ "Resources/PS.ModelBase.Models.Person.fa.resx" ]
}
}
for just the one resource file, or
"include" string/string[]
A list of file globbing patterns for files
to include. The patterns are rooted at the project folder. Defaults to
none.
"buildOptions": {
"embed": {
"include": [ "Resources" ]
}
}
for the contents of the Resources folder.
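If it is unclear what actually got embedded, one way to check is to list the manifest resource names (a small verification sketch; note that culture-specific .resx files are normally compiled into per-culture satellite assemblies rather than the main assembly):
using System;
using System.Reflection;

// Prints every resource name embedded in the assembly that contains Person;
// ResourceManager's base name must match one of these entries.
var assembly = typeof(PS.ModelBase.Models.Person).GetTypeInfo().Assembly;
foreach (var name in assembly.GetManifestResourceNames())
{
    Console.WriteLine(name);
}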

Related

Karma config fails to add an external project or file in the current project

I have added unit tests in some frontend projects using Karma. I have multiple projects in my Git folder. If I run them individually, they work fine. However, if one project has a dependency on another project, it fails to include it (failed to load JavaScript resource:).
If I run the tests using the HTML file directly, the tests run normally and even load the external projects without any error. Following are my resource roots in my unitTest.qunit.html file:
data-sap-ui-resourceroots='{
    "x.y.projectmain": "../../",
    "test.unit": "./",
    "x.y.project2": "../../../../project2/WebContent"
}'
If I try to include the project the same way in my karma.conf.js, it gives an error:
"Failed to resolve dependencies of 'x/y/projectmain/test/unit/AllTests.js' -> 'x/y/projectmain/test/unit/myUnitTest.js' -> 'x.y.project2/util/myfile.js': failed to load 'x.y.project2/util/myfile.js' from ./../../project2/WebContent/util/myfile.js: script load error"
Following are some of my karma.conf.js settings:
ui5: {
    type: "library",
    paths: {
        src: "projectmain/WebContent",
        test: "projectmain/WebContent/test"
    },
    url: "https://openui5.hana.ondemand.com",
    mode: "script",
    config: {
        async: true,
        bindingSyntax: "complex",
        compatVersion: "edge",
        resourceRoots: {
            "x.y.projectmain": "./base/projectmain/WebContent",
            // "x.y.project2": path.resolve('../project2/WebContent')
            "x.y.project2": "./../../projet2/WebContent"
            // "x.y.project2": "./base/projectmain/WebContent/test/resources/project2/WebContent"
            // "x.y.project2.util": "./base/project2/WebContent/util"
        }
    },
    tests: [
        "x.y.projectmain/test/unit/AllTests"
    ]
},
files: [
    'Utils.js',
    { pattern: "../public/Project2/WebContent/utils/myfile.js", included: false, served: true, watched: false, nocache: true },
    { pattern: '../Project2/WebContent/**/*', watched: true, served: true, included: false }
],
// proxies: {
//     '/project2/': path.resolve('../../project2/WebContent')
// },
proxies: {
    '/x.y.project2/': '/absolute/' + path.resolve('../project2/WebContent'),
    '/myfile.js/': '../public/project2/WebContent/util/myfile.js'
},
I have tried many things here. It even refers to the exact file in that external project, but it just can't load the file. If I try to load the file manually in the browser, it opens fine. But with Karma it gives an error.
My ultimate goal is to add one project as a dependency inside another project. I did check it by copying the whole WebContent folder from Project 2 into the 'ProjectMain/WebContent/test/Resources/' directory. It does work, but that's not an appropriate way to include it.
There must be some way to register or include one project in another, either as a resource root or via proxies.
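For reference, the general Karma mechanism being reached for here is to serve the external files without including them and proxy a virtual prefix onto Karma's /absolute/ route. The sketch below only illustrates that pattern; the paths are assumptions based on the layout above, not a verified fix:
// karma.conf.js (sketch): serve project2's sources without executing them,
// then map the "x.y.project2" prefix to their location on disk.
const path = require("path");

module.exports = function (config) {
    config.set({
        files: [
            // served: true makes Karma host the files; included: false keeps them out of the test bundle.
            { pattern: "../project2/WebContent/**/*", included: false, served: true, watched: false }
        ],
        proxies: {
            // Files outside basePath are exposed under /absolute/<full path>/ (note the trailing slash).
            "/x.y.project2/": "/absolute/" + path.resolve("../project2/WebContent") + "/"
        }
    });
};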

Create a SharePoint list in a specific folder with the Microsoft Graph API

Premise:
I have a folder under the SharePoint site, e.g.
https://... <mycompany-sharepoint-site.com>/
under which we have folders, so my URL is something like
https://... <mycompany-sharepoint-site.com>/Documents/Sub_folder_1/Sub_folder_2
pertaining to our project.
I need to be able to create a SharePoint list in the Sub_folder_2 folder and not at the root level.
With the SharePoint Graph API create-list URL
POST https://graph.microsoft.com/v1.0/sites/{site-id}/lists
I will ONLY be able to create at the <mycompany-sharepoint-site.com> level (i.e. at the root level), which is not what I want.
FYI, I already tried (in Postman) to go the drives//items/<folder_id> route, or I should say attempted to do so, but failed.
Any help is greatly appreciated.
If you want to use Graph API to create a folder in SharePoint, please use the following query:
POST /groups/{group-id}/drive/items/{parent-item-id}/children
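For reference, the body for that request is just the new driveItem's metadata; the folder name below is illustrative:
{
  "name": "Sub_folder_2",
  "folder": { },
  "@microsoft.graph.conflictBehavior": "rename"
}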
For more information: https://learn.microsoft.com/en-us/graph/api/driveitem-post-children?view=graph-rest-1.0&tabs=http#http-request
Hope this is helpful.
I don't think that SharePoint supports creating a list inside a folder, but you can at least try to create a list and specify the path in the parent reference.
You need to find out the drive id.
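One way to look up the drive id is to list the site's drives (a sketch; the site_id placeholder is the same as in the request below):
GET https://graph.microsoft.com/v1.0/sites/{site_id}/drives
(or GET https://graph.microsoft.com/v1.0/sites/{site_id}/drive for the default document library)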
POST https://graph.microsoft.com/v1.0/sites/{site_id}/lists
{
  "displayName": "Test",
  "columns": [
    {
      "name": "Column1",
      "text": {}
    },
    {
      "name": "Column2",
      "number": {}
    }
  ],
  "list": {
    "template": "genericList"
  },
  "parentReference": {
    "driveType": "documentLibrary",
    "driveId": "{drive_id}",
    "path": "/drives/{drive_id}/root:/Documents/Sub_folder_1/Sub_folder_2"
    # or
    # "path": "/drives/{drive_id}/root:/Sub_folder_1/Sub_folder_2"
  }
}
I don't think it's even possible at all. Did you manage to do it manually?

How to create an AWS SSM Document Package using Terraform

Using Terraform, I am trying to create an AWS SSM Document Package for Chrome so I can install it on various EC2 instances I have. I define these steps via Terraform:
Upload a zip containing the Chrome installer plus install and uninstall PowerShell scripts.
Add that zip to an SSM package.
However, when I execute terraform apply I receive the following error:
Error updating SSM document: InvalidParameterValueException:
AttachmentSource not provided in the input request.
status code: 400, request id: 8d89da70-64de-4edb-95cd-b5f52207794c
The contents of my main.tf are as follows:
# 1. Add package zip to S3.
resource "aws_s3_bucket_object" "windows_chrome_executable" {
  bucket = "mybucket"
  key    = "ssm_document_packages/GoogleChromeStandaloneEnterprise64.msi.zip"
  source = "./software-packages/GoogleChromeStandaloneEnterprise64.msi.zip"
  etag   = md5("./software-packages/GoogleChromeStandaloneEnterprise64.msi.zip")
}

# 2. Create AWS SSM Document Package using the zip.
resource "aws_ssm_document" "ssm_document_package_windows_chrome" {
  name          = "windows_chrome"
  document_type = "Package"

  attachments_source {
    key    = "SourceUrl"
    values = ["/path/to/mybucket"]
  }

  content = <<DOC
{
  "schemaVersion": "2.0",
  "version": "1.0.0",
  "packages": {
    "windows": {
      "_any": {
        "x86_64": {
          "file": "GoogleChromeStandaloneEnterprise64.msi.zip"
        }
      }
    }
  },
  "files": {
    "GoogleChromeStandaloneEnterprise64.msi.zip": {
      "checksums": {
        "sha256": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
      }
    }
  }
}
DOC
}
If I change the file from a zip to a vanilla msi I do not receive the error message; however, when I navigate to the package in the AWS console it tells me that the install.ps1 and uninstall.ps1 files are missing (since obviously they weren't included).
Has anyone experienced the above error and do you know how to resolve it? Or does anyone have reference to a detailed example of how to do this?
Thank you.
I ran into this same problem; in order to fix it, I added a trailing slash to the SourceUrl value parameter:
attachments_source {
  key    = "SourceUrl"
  values = ["/path/to/mybucket/"]
}
My best guess is that it appends the filename from the package spec to the value provided in the attachments_source value, so it needs the trailing slash to build a valid path to the actual file.
This is the way it should be set up for an attachment in S3:
attachments_source {
  key    = "S3FileUrl"
  values = ["s3://packer-bucket/packer_1.7.0_linux_amd64.zip"]
  name   = "packer_1.7.0_linux_amd64.zip"
}
I realized that in the above example there was no way Terraform could identify a dependency between the two resources, i.e. that the S3 object needs to be created before the aws_ssm_document. Thus, I added the following explicit dependency inside the aws_ssm_document:
depends_on = [
aws_s3_bucket_object.windows_chrome_executable
]
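Putting the two suggestions together, the document resource might end up looking roughly like this (bucket and key reuse the question's names; the content manifest is unchanged):
resource "aws_ssm_document" "ssm_document_package_windows_chrome" {
  name          = "windows_chrome"
  document_type = "Package"

  attachments_source {
    key    = "S3FileUrl"
    values = ["s3://mybucket/ssm_document_packages/GoogleChromeStandaloneEnterprise64.msi.zip"]
    name   = "GoogleChromeStandaloneEnterprise64.msi.zip"
  }

  # Same package manifest JSON as in the question.
  content = <<DOC
... the package manifest JSON from the question ...
DOC

  depends_on = [
    aws_s3_bucket_object.windows_chrome_executable
  ]
}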

How to upload large amounts of stopwords into AWS Elasticsearch

Is it possible to upload a stopwords.txt onto AWS Elasticsearch and specify its path in a stop token filter?
If you're using AWS Elasticsearch, the only option to do this is via the Elasticsearch REST APIs.
To import large data sets, you can use the bulk API.
Edit: You can now upload "packages" to AWS Elasticsearch service, which lets you add custom lists of stopwords etc. See https://docs.aws.amazon.com/elasticsearch-service/latest/developerguide/custom-packages.html
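As for the bulk API mentioned above, a request is newline-delimited JSON with alternating action and document lines, roughly like this (index and field names are illustrative; older versions also expect a _type in the action line):
POST _bulk
{ "index": { "_index": "my_index" } }
{ "title": "first document" }
{ "index": { "_index": "my_index" } }
{ "title": "second document" }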
No, it isn't possible to upload a stopwords.txt file to the hosted AWS Elasticsearch service.
What you will have to do is specify the stopwords in a custom analyzer. More details on how to do that can be found in the official documentation.
The official documentation then says to "close and reopen" the index, but again, AWS Elasticsearch doesn't allow that, so you will then have to reindex.
Example:
1. Create an index with your stopwords listed inline within a custom analyzer, e.g.
PUT /my_new_index
{
  "settings": {
    "analysis": {
      "analyzer": {
        "english_analyzer": {
          "type": "english",
          "stopwords": ["a", "the", "they", "and"]
        }
      }
    }
  }
}
2. Reindex
POST _reindex
{
  "source": {
    "index": "my_index"
  },
  "dest": {
    "index": "my_new_index"
  }
}
Yes, it is possible, by setting stopwords_path when defining your stop token filter.
stopwords_path => A path (either relative to config location, or
absolute) to a stopwords file configuration. Each stop word should be
in its own "line" (separated by a line break). The file must be UTF-8
encoded.
Here is how I did it.
Copied the stopwords.txt file into the config folder under my Elasticsearch home path.
Created a custom token filter with the path set in stopwords_path:
PUT /testindex
{
  "settings": {
    "analysis": {
      "filter": {
        "teststopper": {
          "type": "stop",
          "stopwords_path": "stopwords.txt"
        }
      }
    }
  }
}
Verified that the filter was working as expected with the _analyze API:
GET testindex/_analyze
{
  "tokenizer": "standard",
  "token_filters": ["teststopper"],
  "text": "this is a text to test the stop filter",
  "explain": true,
  "attributes": ["keyword"]
}
The tokens 'a', 'an', 'the', 'to', 'is' were filtered out since I had added them to the config/stopwords.txt file.
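For reference, the config/stopwords.txt used here simply lists one stop word per line:
a
an
the
to
is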
For more info:
https://www.elastic.co/guide/en/elasticsearch/reference/current/analysis-stop-tokenfilter.html
https://www.elastic.co/guide/en/elasticsearch/reference/2.2/_explain_analyze.html

Composer require branch name

For example, I want to require:
{
  "repositories": [
    {
      "type": "git",
      "url": "https://github.com/google/google-api-php-client.git"
    }
  ],
  "require": {
    "google/apiclient": "v1-master"
  }
}
In this example I try to require google/apiclient on the v1-master branch. I get this error:
[UnexpectedValueException]
Could not parse version constraint v1-master: Invalid version string "v1-master"
You need to prefix all dev (non-tagged) branch names with dev-.
To install the branch you need, use:
composer require google/apiclient:dev-v1-master
See composer docs.
This will work:
{
  "repositories": [
    {
      "type": "git",
      "url": "https://github.com/google/google-api-php-client.git"
    }
  ],
  "require": {
    "google/apiclient": "dev-BRANCH_NAME"
  }
}
So the pattern is "dev-*": if your branch name is "bug-fix", require "dev-bug-fix".
With the command line:
composer require google/apiclient:dev-BRANCH_NAME
I was trying to do the same for a different Google repository which contains several packages, and it took me some time to figure it out. Therefore I am sharing my solution below.
My goal was to pull the latest google/cloud-compute from https://github.com/googleapis/google-cloud-php.git on the master branch.
The following steps worked for me:
Clone the repository
git clone https://github.com/googleapis/google-cloud-php.git google-cloud-php
Set composer.json to use the right package from the local folder:
{
  "repositories": [
    {
      "type": "path",
      "url": "/Users/USERNAME/projects/google-cloud-php/Compute"
    }
  ],
  "require": {
    "google/cloud-compute": "dev-master"
  }
}
Please note that in step 2 the url is pointing to the Compute subfolder where the actual google/cloud-compute package exists.
My solution can easily be tweaked for any branch: you would just need to git checkout the appropriate branch in step 1 and then change 'dev-master' to 'dev-YOUR_BRANCH' in step 2.
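For example (YOUR_BRANCH is a placeholder for the real branch name):
cd /Users/USERNAME/projects/google-cloud-php
git checkout YOUR_BRANCH
and in composer.json:
"require": {
    "google/cloud-compute": "dev-YOUR_BRANCH"
}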