When I create .NET Core web applications, I use the secret manager during testing. I am generally able to create a new web project (MVC and Web API), right-click on the project and select "Manage User Secrets". This opens a JSON file where I add the secrets. I then use this in my Startup.cs, something like this:
services.AddDbContext<ApplicationDbContext>(options =>
options.UseMySql(Configuration["connectionString"]));
The website works fine with this and connects to the database without a problem. However, when I try using EF Core migration commands such as Add-Migration, they don't seem to be able to access the connection string from the secret manager. I get an error saying "connection string can't be null". The error goes away when I replace Configuration["connectionString"] with the hard-coded string. I have checked online and the .csproj file already contains the following lines:
<UserSecretsId>My app name</UserSecretsId>
And later:
<ItemGroup>
<DotNetCliToolReference Include="Microsoft.EntityFrameworkCore.Tools.DotNet" Version="2.0.1" />
<DotNetCliToolReference Include="Microsoft.Extensions.SecretManager.Tools" Version="2.0.0" />
</ItemGroup>
Is there anything I need to add so the migrations can access the connection string?
Update
I only have one constructor in the context class:
public ApplicationDBContext(DbContextOptions<ApplicationDBContext> options) : base(options)
{
}
I am currently running into this exact problem as well. I have come up with a solution that works for now, but one that might be considered messy at best.
I have created a static Configuration class that provides an IConfiguration instance when requested:
public static class Configuration
{
public static IConfiguration GetConfiguration()
{
return new ConfigurationBuilder()
.AddJsonFile("appsettings.json", true, true)
.AddUserSecrets<Startup>()
.AddEnvironmentVariables()
.Build();
}
}
In the migration, you can then get the configuration and access its user secrets like this:
protected override void Up(MigrationBuilder migrationBuilder)
{
var conf = Configuration.GetConfiguration();
var secret = conf["Secret"];
}
I have tested creating a SQL script with these user secrets, and it works (you obviously wouldn't want to keep the script lying around, since it would expose the actual secret).
Update
The above configuration can also be set up in the Program.cs class, in the BuildWebHost method:
var config = new ConfigurationBuilder().AddUserSecrets<Startup>().Build();
return WebHost.CreateDefaultBuilder(args).UseConfiguration(config)...Build()
Or in the Startup constructor, if you are using that convention.
Update 2 (explanation)
It turns out this issue arises because the migration commands run with the environment set to "Production". The secret manager is pre-set to work only in the "Development" environment (for a good reason). The .AddUserSecrets<Startup>() call simply adds the secrets for all environments.
To ensure that this doesn't end up applying to your production server, there are two solutions I have noticed. The first is suggested here: https://learn.microsoft.com/en-us/ef/core/miscellaneous/cli/powershell
Set env:ASPNETCORE_ENVIRONMENT before running to specify the ASP.NET Core environment.
This solution means there is no need to set .AddUserSecrets<Startup>() on every project created on the computer in future. However, if you happen to share this project across other computers, this needs to be configured on each of them.
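For example, in the Package Manager Console you can set the environment for the current session before running the EF commands (the migration name below is just a placeholder):
$env:ASPNETCORE_ENVIRONMENT = "Development"
Add-Migration MyMigration
Update-Database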
The second solution is to add .AddUserSecrets<Startup>() only on debug builds, like this:
return new ConfigurationBuilder()
.AddJsonFile("appsettings.json", true, true)
#if DEBUG
.AddUserSecrets<Startup>()
#endif
.AddEnvironmentVariables()
.Build();
Additional Info
The IConfiguration interface can be passed to controllers in their constructor, e.g.:
private readonly IConfiguration _configuration;
public TestController(IConfiguration configuration)
{
_configuration = configuration;
}
Thus, any secrets and application settings are accessible in that controller via _configuration["secret"].
However, if you want to access application secrets from, for example, a migration file, which exists outside of the web application itself, you need to stick with the original approach because there's no easy way (that I know of) to access those secrets otherwise (one use case I can think of would be seeding the database with an admin user and a master password).
To use migrations in .NET Core with user secrets, we can also define a factory class (SqlContextFactory) that creates its own instance of the SqlContext using a specified configuration builder. This way we do not need any workaround in our Program or Startup classes. In the example below, SqlContext is an implementation of DbContext/IdentityDbContext.
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Design;
using Microsoft.Extensions.Configuration;
public class SqlContextFactory : IDesignTimeDbContextFactory<SqlContext>
{
public SqlContext CreateDbContext(string[] args)
{
var config = new ConfigurationBuilder()
.AddJsonFile("appsettings.json", optional: false)
.AddUserSecrets<Startup>()
.AddEnvironmentVariables()
.Build();
var builder = new DbContextOptionsBuilder<SqlContext>();
builder.UseSqlServer(config.GetConnectionString("DefaultConnection"));
return new SqlContext(builder.Options);
}
}
Since I have noticed a lot of people running into this confusion, I am writing a simplified version of this resolution.
The Problem/Confusion
The secret manager in .NET Core is designed to work only in the Development environment. When running your app, your launchSettings.json file ensures that your ASPNETCORE_ENVIRONMENT variable is set to "Development". However, when you run EF migrations, this file is not used. As a result, when you run migrations, your web app does not run in the Development environment and thus has no access to the secret manager. This often causes confusion as to why EF migrations can't use the secret manager.
The Resolution
Make sure the environment variable "ASPNETCORE_ENVIRONMENT" is set to "Development" on your computer.
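For example, on Windows:
setx ASPNETCORE_ENVIRONMENT "Development"
or on Linux/macOS (current shell session):
export ASPNETCORE_ENVIRONMENT=Development
(You may need to restart Visual Studio or your terminal for the change to take effect.)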
Using .AddUserSecrets<Startup>() will create a circular reference if we have our DbContext in a separate class library and use a design-time factory.
The clean way of doing that is:
public class DesignTimeDbContextFactory : IDesignTimeDbContextFactory<AppDbContext>
{
public AppDbContext CreateDbContext(string[] args)
{
var configuration = new ConfigurationBuilder()
.SetBasePath(Directory.GetCurrentDirectory())
#if DEBUG
.AddJsonFile(Directory.GetCurrentDirectory() +
"{project path}/appsettings.Development.json",
optional: true, reloadOnChange: true)
#else
.AddJsonFile(Directory.GetCurrentDirectory() +
"{startup project path}/appsettings.json",
optional: true, reloadOnChange: true)
#endif
.AddEnvironmentVariables()
.Build();
var connectionString = configuration.GetConnectionString("DefaultConnection");
var builder = new DbContextOptionsBuilder<AppDbContext>();
Console.WriteLine(connectionString);
builder.UseSqlServer(connectionString);
return new AppDbContext(builder.Options);
}
}
The Explanation:
Secret Manager is meant for development time only, so this will not affect the migration if you run it in a pipeline in the QA or Production stages. To handle that, we use the dev connection string, which exists in appsettings.Development.json, inside the #if DEBUG block.
The benefit of this approach is that it avoids referencing the web project's Startup class while using a class library as your data infrastructure.
Related
I stored my MySQL DB credentials in AWS Secrets Manager using the "Credentials for other database" option. I want to import these credentials into my application.properties file. Based on a few answers I found in this thread, I did the following:
Added the dependency spring-cloud-starter-aws-secrets-manager-config
Added spring.application.name = <application name> and spring.config.import = aws-secretsmanager: <Secret name> in application.properties
Used secret keys as placeholders in the following properties:
spring.datasource.url = jdbc:mysql://${host}:3306/db_name
spring.datasource.username=${username}
spring.datasource.password=${password}
I am getting the following error while running the application:
java.lang.IllegalStateException: Unable to load config data from 'aws-secretsmanager:<secret_name>'
Caused by: java.lang.IllegalStateException: File extension is not known to any PropertySourceLoader. If the location is meant to reference a directory, it must end in '/' or File.separator
First, is the process I am following correct? If yes, what is this error about and how do I resolve it?
I found the problem that was causing the error. Apparently I was adding the wrong dependency.
According to the latest docs, the configuration support for using spring.config.import to import AWS secrets has been moved to io.awspring.cloud from org.springframework.cloud. So the updated dependency would be io.awspring.cloud:spring-cloud-starter-aws-secrets-manager-config:2.3.3 and NOT org.springframework.cloud:spring-cloud-starter-aws-secrets-manager-config:2.2.6
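For reference, the corrected dependency in Maven form (using the coordinates and version mentioned above) looks like this:
<dependency>
    <groupId>io.awspring.cloud</groupId>
    <artifactId>spring-cloud-starter-aws-secrets-manager-config</artifactId>
    <version>2.3.3</version>
</dependency>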
You are trying to use spring.config.import, and support for this was introduced in Spring Cloud AWS 2.3:
https://spring.io/blog/2021/03/17/spring-cloud-aws-2-3-is-now-available
Secrets Manager
Support loading properties through spring.config.import, introduced in Spring Cloud 2020.0. Read more about integrating your
Spring Cloud application with the AWS Secrets Manager.
Removed the dependency to auto-configure module #526.
Dropped the dependency to javax.validation:validation-api.
Allow Secrets Manager prefix without “/” in the front #736.
In spring-cloud 2020.0.0 (aka Ilford), the bootstrap phase is no longer enabled by default. In order to enable it you need an additional dependency:
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-bootstrap</artifactId>
<version>{spring-cloud-version}</version>
</dependency>
However, starting with spring-cloud-aws 2.3, you can import the default AWS Secrets Manager keys (spring.config.import=aws-secretsmanager:) or individual keys (spring.config.import=aws-secretsmanager:secret-key;other-secret-key):
https://github.com/spring-cloud/spring-cloud-aws/blob/main/docs/src/main/asciidoc/secrets-manager.adoc
application.yml
spring.config.import: aws-secretsmanager:/secrets/spring-cloud-aws-sample-app
Or try to leave it empty:
spring.config.import=aws-secretsmanager:
As such, it will use spring.application.name by default.
App:
@SpringBootApplication
public class App {
private static final Logger LOGGER = LoggerFactory.getLogger(App.class);
public static void main(String[] args) {
SpringApplication.run(App.class, args);
}
@Bean
ApplicationRunner applicationRunner(@Value("${password}") String password) {
return args -> {
LOGGER.info("`password` loaded from the AWS Secret Manager: {}", password);
};
}
}
I am integrating my application with AWS Parameter Store. For local development, which may have no access to AWS, I need to disable fetching property values from AWS and use the values from application.yml instead. The issue is not application.yml but the dependencies: as soon as the AWS starter appears in the POM, the AWS integration is initialized and Spring tries to use AwsParamStorePropertySourceLocator. I guess what I need to do is force my application to use Spring's default property source locator regardless of the AWS jar being on the classpath. Not sure how to do that.
For Parameter Store it is quite easy: the AwsParamStoreBootstrapConfiguration bean is conditional on the property aws.paramstore.enabled. Creating an aws.paramstore.enabled environment variable and setting its value to false will disable the Parameter Store integration.
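A minimal sketch of that property in YAML form, assuming you keep it in a local or profile-specific application.yml:
aws:
  paramstore:
    enabled: false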
I also tried disabling the AWS Secrets Manager, and setting aws.secretsmanager.enabled to false is not sufficient. To fully disable it I had to exclude the auto-configuration for a few classes:
import org.springframework.cloud.aws.autoconfigure.context.ContextCredentialsAutoConfiguration;
import org.springframework.cloud.aws.autoconfigure.context.ContextInstanceDataAutoConfiguration;
import org.springframework.cloud.aws.autoconfigure.context.ContextRegionProviderAutoConfiguration;
import org.springframework.cloud.aws.autoconfigure.context.ContextResourceLoaderAutoConfiguration;
import org.springframework.cloud.aws.autoconfigure.context.ContextStackAutoConfiguration;
import org.springframework.cloud.aws.autoconfigure.mail.MailSenderAutoConfiguration;
@Configuration
@Profile("local")
@EnableAutoConfiguration(exclude = { ContextCredentialsAutoConfiguration.class,
ContextInstanceDataAutoConfiguration.class, ContextRegionProviderAutoConfiguration.class,
ContextResourceLoaderAutoConfiguration.class, ContextStackAutoConfiguration.class,
MailSenderAutoConfiguration.class })
public class LocalScanConfig {
}
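With that in place, you only need to activate the local profile when running the application locally, for example (the jar name is just a placeholder):
java -jar app.jar --spring.profiles.active=local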
After learning JUnit and experiencing its benefits for both the programmer and the project, I now want to unit test the service layer of each entity and check that each method works properly.
As of now, I have already created a unit test for all of my service classes, but the problem is that the datasource's data isn't suited for testing. Thus I have to create another database for service-layer testing and configure that datasource for the unit tests. The thing is, I don't know how to configure another datasource that only the code under src/test/java can access and that isn't used in production. I'm still new to Spring Boot and Spring Data, so I'm asking how to configure such a setup here.
As of now I have this application.properties configuration.
spring.datasource.url=<DatabaseURL>
spring.datasource.username=<DatabaseUsername>
spring.datasource.password=<DatabasePassword>
spring.datasource.driver-class-name=<DatabaseDriver>
# another datasource configuration
And here's sample code for a service class, which uses the datasource configuration from application.properties.
@Service
public class FooService {
@PersistenceContext
private EntityManager entityManager;
public List<Foo> findAllByFooForm(FooForm fooForm) {
// JPA CriteriaBuilder query according to FooForm
return entityManager.createQuery(query).getResultList();
}
}
Finally, here's sample code for a unit test of a service class.
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = Application.class)
public class FooServiceTest {
@Autowired
private FooService fooService;
@Test
public void testFindAllByFooForm() {
// Test statements
}
}
There are a few approaches which can be combined to give you good control over this.
First of all, if you create src/test/resources/application.properties, then that file will only be available on the classpath during testing. It will override any properties that you have defined in src/main/resources/application.properties.
If you are using an in-memory database to support those tests, then you can ensure that different import.sql files are loaded, through the use of the following property:
spring.jpa.properties.hibernate.hbm2ddl.import_files=import-test1.sql
That property takes a comma-separated list of import scripts, so you can have a base set of data loaded by one script and additional (perhaps test-specific) data loaded by others, as shown below.
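A minimal example, assuming hypothetical script names:
spring.jpa.properties.hibernate.hbm2ddl.import_files=import-base.sql,import-test1.sql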
If you wish to connect to a different database in each test, or cause different import scripts to be used, then you can use profiles to trigger this. If you create a properties file application-test1.properties, then the test itself can cause that to be loaded using the annotation @ActiveProfiles({"test1"}), as in the sketch below.
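For example, reusing the names from the question above, the test class could be tied to that profile like this:
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = Application.class)
@ActiveProfiles({"test1"})
public class FooServiceTest {
    @Autowired
    private FooService fooService;

    @Test
    public void testFindAllByFooForm() {
        // runs against the datasource configured in application-test1.properties
    }
}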
I'm attempting to add a custom header filter in my Dropwizard instance to check whether the request's version is in sync with the Dropwizard instance's version.
I see you can use FilterBuilder to add jetty CrossOriginFilters. However, I am having trouble figuring out how to set a custom filter.
Thanks
Via the Environment class.
https://dropwizard.github.io/dropwizard/manual/core.html#environments
#Override
public void run(MyApplicationConfiguration configuration, Environment environment) {
environment.servlets().addFilter("Custom-Filter-Name", new MyCustomFilter()).addMappingForUrlPatterns(EnumSet.allOf(DispatcherType.class), true, "/*");
}
You can choose which dispatcher types apply by changing EnumSet.allOf(DispatcherType.class).
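A minimal sketch of what MyCustomFilter might look like for a version check; the header name, expected version, and rejection status code are assumptions for illustration:
import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class MyCustomFilter implements Filter {

    // Assumed header name and expected version, purely for illustration
    private static final String VERSION_HEADER = "X-App-Version";
    private static final String EXPECTED_VERSION = "1.0";

    @Override
    public void init(FilterConfig filterConfig) throws ServletException {
        // nothing to initialise
    }

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest httpRequest = (HttpServletRequest) request;
        String clientVersion = httpRequest.getHeader(VERSION_HEADER);
        if (clientVersion == null || !clientVersion.equals(EXPECTED_VERSION)) {
            // Reject requests whose version is not in sync with this instance
            ((HttpServletResponse) response).sendError(
                    HttpServletResponse.SC_BAD_REQUEST, "Client version is out of sync");
            return;
        }
        chain.doFilter(request, response);
    }

    @Override
    public void destroy() {
        // nothing to clean up
    }
}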
This is how I got it to work using Dropwizard 0.7.1 (the APIs appear to have changed from other examples I found out there).
In run method of your application:
final FilterRegistration.Dynamic cors = environment.servlets().addFilter("crossOriginRequests", CrossOriginFilter.class);
cors.addMappingForUrlPatterns(EnumSet.allOf(DispatcherType.class), true, "/*");
https://gist.github.com/craigbeck/fb71818063175b9b4210
Is it possible to generate fixtures from an existing DB in Symfony2/Doctrine? How could I do that?
Example:
I have defined 15 entities and my Symfony2 application is working. Now some people are able to browse to the application, and by using it they have inserted about 5000 rows so far. Now I want that data as fixtures, but I don't want to write them by hand. How can I generate them from the DB?
There's no direct way within Doctrine or Symfony2, but writing a code generator for it (either inside or outside of sf2) would be trivial. Just pull each entity, generate a line of code to set each property, then put it in your fixture-loading method. Example:
<?php
$code = '';
$i = 0;
$entities = $em->getRepository('MyApp:Entity')->findAll();
foreach($entities as $entity)
{
$code .= "$entity_{$i} = new MyApp\Entity();\n";
$code .= "$entity_{$i}->setMyProperty('" . addslashes($entity->getMyProperty()); . "'); \n");
$code .= "$manager->persist($entity_{$i}); \n $manager->flush();";
++$i;
}
// store code somewhere with file_put_contents
As I understand your question, you have two databases: the first is already in production and filled with 5000 rows, and the second one is a new database you want to use for testing and development. Is that right?
If it is, I suggest you create two entity managers in your test environment: the first will be the 'default' one, used in your project (your controllers, etc.); the second will connect to your production database. You will find how to deal with multiple entity managers here: http://symfony.com/doc/current/cookbook/doctrine/multiple_entity_managers.html
Then, you should create a fixture class which has access to your container. There is a how-to here: http://symfony.com/doc/current/bundles/DoctrineFixturesBundle/index.html#using-the-container-in-the-fixtures.
Using the container, you will have access to both entity managers. And this is the 'magic': you retrieve the objects from your production database and persist them with the second entity manager, which will insert them into your test database.
I point your attention to two points:
If there are relationships between objects, you will have to take care of those dependencies: owning side, inverse side, ...
If you have 5000 rows, watch the memory your script will use. Another solution may be to use native SQL to retrieve all the rows from your production database and insert them into your test database. Or a SQL script...
I do not have any code to suggest to you, but I hope this idea will help you.
I assume that you want to use fixtures (and not just dump the production or staging database into the development database) because a) your schema changes and the dumps would not work if you update your code, or b) you don't want to dump the whole database but only want to extend some custom fixtures. An example I can think of: you have 206 countries in your staging database and users add cities to those countries; to keep the fixtures small you only have 5 countries in your development database, however you want to add the cities that users added to those 5 countries in the staging database to the development database.
The only solution I can think of is to use the mentioned DoctrineFixturesBundle and multiple entity managers.
First of all you should configure two database connections and two entity managers in your config.yml
doctrine:
dbal:
default_connection: default
connections:
default:
driver: %database_driver%
host: %database_host%
port: %database_port%
dbname: %database_name%
user: %database_user%
password: %database_password%
charset: UTF8
staging:
...
orm:
auto_generate_proxy_classes: %kernel.debug%
default_entity_manager: default
entity_managers:
default:
connection: default
mappings:
AcmeDemoBundle: ~
staging:
connection: staging
mappings:
AcmeDemoBundle: ~
As you can see, both entity managers map the AcmeDemoBundle (in this bundle I will put the code to load the fixtures). If the second database is not on your development machine, you could just dump the SQL from the other machine to the development machine. That should be possible since we are talking about 5000 rows and not about millions of rows.
What you can do next is to implement a fixture loader that uses the service container to retrieve the second entity manager and use Doctrine to query the data from the second database and save it to your development database (the default entity manager):
<?php
namespace Acme\DemoBundle\DataFixtures\ORM;
use Doctrine\Common\DataFixtures\FixtureInterface;
use Doctrine\Common\Persistence\ObjectManager;
use Symfony\Component\DependencyInjection\ContainerAwareInterface;
use Symfony\Component\DependencyInjection\ContainerInterface;
use Acme\DemoBundle\Entity\City;
use Acme\DemoBundle\Entity\Country;
class LoadData implements FixtureInterface, ContainerAwareInterface
{
private $container;
private $stagingManager;
public function setContainer(ContainerInterface $container = null)
{
$this->container = $container;
$this->stagingManager = $this->container->get('doctrine')->getManager('staging');
}
public function load(ObjectManager $manager)
{
$this->loadCountry($manager, 'Austria');
$this->loadCountry($manager, 'Germany');
$this->loadCountry($manager, 'France');
$this->loadCountry($manager, 'Spain');
$this->loadCountry($manager, 'Great Britain');
$manager->flush();
}
protected function loadCountry(ObjectManager $manager, $countryName)
{
$country = new Country($countryName);
$cities = $this->stagingManager->createQueryBuilder()
->select('c')
->from('AcmeDemoBundle:City', 'c')
->leftJoin('c.country', 'co')
->where('co.name = :country')
->setParameter('country', $countryName)
->getQuery()
->getResult();
foreach ($cities as $city) {
$city->setCountry($country);
$manager->persist($city);
}
$manager->persist($country);
}
}
In the loadCountry method, I load the objects from the staging entity manager, add a reference to the fixture country (the one that already exists in your current fixtures) and persist them using the default entity manager (your development database).
Sources:
DoctrineFixturesBundle
How to work with Multiple Entity Managers
You could use https://github.com/Webonaute/DoctrineFixturesGeneratorBundle
It adds the ability to generate fixtures for a single entity using commands like:
$ php bin/console doctrine:generate:fixture --entity=Blog:BlogPost --ids="12 534 124" --name="bug43" --order="1"
Or you can create a full snapshot:
php app/console doctrine:generate:fixture --snapshot --overwrite
The Doctrine Fixtures are useful because they allow you to create objects and insert them into the database. This is especially useful when you need to create associations or say, encode a password using one of the password encoders. If you already have the data in a database, you shouldn't really need to bring them out of that format and turn it into PHP code, only to have that PHP code insert the same data back into the database. You could probably just do an SQL dump and then re-insert them into your database again that way.
Using a fixture would make more sense if you were initializing your project but wanted to use user input to create it. If you had the default user in your config file, you could read that and insert the object.
The AliceBundle can help you do this. Indeed, it allows you to load fixtures with YAML (or PHP array) files.
For instance you can define your fixtures with:
Nelmio\Entity\Group:
group1:
name: Admins
owner: '@user1->id'
Or with the same structure in a PHP array. It's WAY easier than generating working PHP code.
It also supports references:
Nelmio\Entity\User:
# ...
Nelmio\Entity\Group:
group1:
name: Admins
owner: '@user1'
In the doctrine_fixtures cookbook, you can see in the last example how to get the service container in your fixture class.
With this service container, you can retrieve the Doctrine service, then the entity manager. With the entity manager, you will be able to get all the data you need from your database.
Hope this will help you!