Draw.io: Foreign Key on SQL Plugin

Does draw.io support foreign key relationships?
I tested it with a number of different SQL samples (the w3schools SQL FOREIGN KEY examples), but none worked.

Due to budget constraints I was only able to implement it to work with MySQL, SQL Server, and SQL Server Generated Scripts.
The pull request is pending at https://github.com/jgraph/drawio/pull/233.
MySQL Example:
CREATE TABLE Persons
(
PersonID int,
LastName varchar(255),
FirstName varchar(255),
Address varchar(255),
City varchar(255)
);
CREATE TABLE Orders (
OrderID int NOT NULL,
OrderNumber int NOT NULL,
PersonID int,
PRIMARY KEY (OrderID),
FOREIGN KEY (PersonID) REFERENCES Persons(PersonID)
);
SQL Server Example:
CREATE TABLE Persons
(
PersonID int,
LastName varchar(255),
FirstName varchar(255),
Address varchar(255),
City varchar(255)
);
CREATE TABLE Orders (
OrderID int NOT NULL,
PersonID int,
PRIMARY KEY (OrderID),
CONSTRAINT FK_PersonOrder FOREIGN KEY (PersonID)
REFERENCES Persons(PersonID)
);
SQL Server Generated Script Example:
CREATE TABLE [dbo].[aspnet_Applications](
[ApplicationName] [nvarchar](256) NOT NULL,
[LoweredApplicationName] [nvarchar](256) NOT NULL,
[ApplicationId] [uniqueidentifier] NOT NULL,
[Description] [nvarchar](256) NULL,
PRIMARY KEY NONCLUSTERED
(
[ApplicationId] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY],
UNIQUE NONCLUSTERED
(
[LoweredApplicationName] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY],
UNIQUE NONCLUSTERED
(
[ApplicationName] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
CREATE TABLE [dbo].[aspnet_Users](
[ApplicationId] [uniqueidentifier] NOT NULL,
[UserId] [uniqueidentifier] NOT NULL,
[UserName] [nvarchar](256) NOT NULL,
[LoweredUserName] [nvarchar](256) NOT NULL,
[MobileAlias] [nvarchar](16) NULL,
[IsAnonymous] [bit] NOT NULL,
[LastActivityDate] [datetime] NOT NULL,
PRIMARY KEY NONCLUSTERED
(
[UserId] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
ALTER TABLE [dbo].[aspnet_Users] WITH CHECK ADD FOREIGN KEY([ApplicationId])
REFERENCES [dbo].[aspnet_Applications] ([ApplicationId])
GO

The PR was merged; you can access this plugin via Arrange > Insert > From SQL at https://www.draw.io/?splash=0&p=sql

I rewrote the SQL parser by #brunomartinspro to be more forgiving about spaces and case sensitivity and to support more SQL database types. Foreign key relationships now work and are drawn properly; I'm just waiting for the PR to be merged in:
https://github.com/jgraph/drawio/pull/3091
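As a quick sanity check of the looser parsing (my own hypothetical test input, not taken from either PR), DDL like the following, with mixed case and uneven spacing, should produce both tables plus the relationship edge when pasted into Arrange > Insert > From SQL:
create table customers (
  customer_id int primary key,
  name varchar(100)
);
CREATE TABLE orders (
  order_id INT PRIMARY KEY,
  customer_id int,
  foreign key (customer_id) references customers (customer_id)
);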

Related

MariaDB won't create foreign key

I think I'm using the right syntax for MariaDB, but my foreign key constraint is not being created.
Here's the create table DDL:
CREATE TABLE items (
id INT auto_increment primary key,
description TEXT NOT NULL
);
CREATE TABLE item_events (
id INT NOT NULL,
calendar_event_guid TEXT(255) NOT NULL,
foreign key item_events_id_fk (id) REFERENCES items (id)
);
Then, when I ask MariaDB to show me what I created, I get this:
+-------------+------------------------------------------------------------------
| Table       | Create Table
+-------------+------------------------------------------------------------------
| item_events | CREATE TABLE `item_events` (
  `id` int(11) NOT NULL,
  `calendar_event_guid` text COLLATE utf8_unicode_ci NOT NULL,
  KEY `item_events_id_fk` (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci
+-------------+------------------------------------------------------------------
or, just showing the DDL:
CREATE TABLE `item_events` (
`id` int(11) NOT NULL,
`calendar_event_guid` text COLLATE utf8_unicode_ci NOT NULL,
KEY `item_events_id_fk` (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci
Notice that it only created a "KEY", not a foreign key. The items table is correctly created.
Surely, this is really simple :)
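A likely cause, though it is not stated in the thread and should be treated as an assumption: the table is being created with the default MyISAM engine, which silently drops FOREIGN KEY clauses and keeps only a plain index (which is exactly the KEY shown above). A sketch of the same DDL forced onto InnoDB, where the constraint should actually be created:
-- both the parent and child tables must be InnoDB for the constraint to be enforced
CREATE TABLE items (
  id INT AUTO_INCREMENT PRIMARY KEY,
  description TEXT NOT NULL
) ENGINE=InnoDB;
CREATE TABLE item_events (
  id INT NOT NULL,
  calendar_event_guid TEXT NOT NULL,
  CONSTRAINT item_events_id_fk FOREIGN KEY (id) REFERENCES items (id)
) ENGINE=InnoDB;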

WSO2 APIM Analytics SQL Server Installation

I am trying to run APIM Analytics but want to move away from the default H2 databases and use SQL Server instead.
Here are the mappings of the database in SQL Server:
${sys:carbon.home}/wso2/dashboard/database/metrics ---> WSO2_APIM_ANALYTICS_METRICS
${sys:carbon.home}/wso2/${sys:wso2.runtime}/database/WSO2_CARBON_DB --> WSO2_APIM_ANALYTICS_CARBON
${sys:carbon.home}/wso2/dashboard/database/MESSAGE_TRACING_DB --> WSO2_APIM_ANALYTICS_MESSAGE_TRACING
${sys:carbon.home}/wso2/worker/database/GEO_LOCATION_DATA --> WSO2_APIM_ANALYTICS_GEO_LOCATION_DATA
${sys:carbon.home}/wso2/worker/database/WSO2AM_MGW_ANALYTICS_DB --> WSO2_APIM_ANALYTICS_MICROGATEWAY_ANALYTICS
${sys:carbon.home}/wso2/${sys:wso2.runtime}/database/SP_MGT_DB --> WSO2_APIM_ANALYTICS_SP_MGT_DB
${sys:carbon.home}/wso2/${sys:wso2.runtime}/database/DASHBOARD_DB --> WSO2_APIM_ANALYTICS_DASHBOARD
${sys:carbon.home}/wso2/${sys:wso2.runtime}/database/SAMPLE_DB --> WSO2_APIM_ANALYTICS_SAMPLE
${sys:carbon.home}/wso2/${sys:wso2.runtime}/database/wso2_status_dashboard --> WSO2_APIM_ANALYTICS_STATUS_DASHBOARD
${sys:carbon.home}/wso2/worker/database/WSO2AM_STATS_DB --> WSO2_METRICS
${sys:carbon.home}/wso2/${sys:wso2.runtime}/database/BUSINESS_RULES_DB --> WSO2_APIM_ANALYTICS_BUSINESS_RULES
${sys:carbon.home}/wso2/${sys:wso2.runtime}/database/PERMISSION_DB --> WSO2_APIM_ANALYTICS_PERMISSIONS
I updated deployment.yaml for all three components (worker, manager, and dashboard) to point to the new data sources.
When I try to run worker.bat, I get the following error messages from Siddhi. It looks like the schema and data for the other databases are not populated the way they are for H2.
How can I get the schema for all the databases that H2 uses and populate them in SQL Server?
I also opened the H2 database but don't see anything in the public schema. Am I missing something?
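As an aside that is not part of the original question: one way to get the schema out of the embedded H2 databases is H2's SCRIPT command, which dumps the DDL so it can be adapted by hand to SQL Server types. Assuming you open one of the embedded H2 database files in the H2 console:
-- dump table definitions only (no rows) into a file you can translate to T-SQL
SCRIPT NODATA TO 'wso2_carbon_db_schema.sql';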
Here are the errors I see when I start the worker node:
{org.wso2.transport.http.netty.listener.ServerConnectorBootstrap$HTTPServerConnector} - HTTP(S) Interface starting on host 0.0.0.0 and port 9444
[2019-04-09 14:22:59,446] ERROR {org.wso2.carbon.stream.processor.core.internal.StreamProcessorDeployer} - org.wso2.siddhi.core.exception.SiddhiAppCreationException: Error on 'apim_abnormal_backend_time_alert_0' # Line: 34. Position: 111, near '#store(type = 'rdbms', datasource = 'APIM_ANALYTICS_DB')
define table ApimAllAlert (type string, tenantDomain string, message string, severity int, alertTimestamp long)'. No extension exist for store:rdbms org.wso2.carbon.stream.processor.core.internal.exception.SiddhiAppDeploymentException: org.wso2.siddhi.core.exception.SiddhiAppCreationException: Error on 'apim_abnormal_backend_time_alert_0' # Line: 34. Position: 111, near '#store(type = 'rdbms', datasource = 'APIM_ANALYTICS_DB')
define table ApimAllAlert (type string, tenantDomain string, message string, severity int, alertTimestamp long)'. No extension exist for store:rdbms
at org.wso2.carbon.stream.processor.core.internal.StreamProcessorDeployer.deploySiddhiQLFile(StreamProcessorDeployer.java:105)
at org.wso2.carbon.stream.processor.core.internal.StreamProcessorDeployer.deploy(StreamProcessorDeployer.java:306)
at org.wso2.carbon.deployment.engine.internal.DeploymentEngine.lambda$deployArtifacts$0(DeploymentEngine.java:291)
at java.util.ArrayList.forEach(ArrayList.java:1257)
at org.wso2.carbon.deployment.engine.internal.DeploymentEngine.deployArtifacts(DeploymentEngine.java:282)
at org.wso2.carbon.deployment.engine.internal.RepositoryScanner.sweep(RepositoryScanner.java:112)
at org.wso2.carbon.deployment.engine.internal.RepositoryScanner.scan(RepositoryScanner.java:68)
at org.wso2.carbon.deployment.engine.internal.DeploymentEngine.start(DeploymentEngine.java:121)
at org.wso2.carbon.deployment.engine.internal.DeploymentEngineListenerComponent.onAllRequiredCapabilitiesAvailable(DeploymentEngineListenerComponent.java:216)
at org.wso2.carbon.kernel.internal.startupresolver.StartupComponentManager.lambda$notifySatisfiableComponents$7(StartupComponentManager.java:266)
at java.util.ArrayList.forEach(ArrayList.java:1257)
at org.wso2.carbon.kernel.internal.startupresolver.StartupComponentManager.notifySatisfiableComponents(StartupComponentManager.java:252)
at org.wso2.carbon.kernel.internal.startupresolver.StartupOrderResolver$1.run(StartupOrderResolver.java:204)
at java.util.TimerThread.mainLoop(Timer.java:555)
at java.util.TimerThread.run(Timer.java:505)
Caused by: org.wso2.siddhi.core.exception.SiddhiAppCreationException: Error on 'apim_abnormal_backend_time_alert_0' # Line: 34. Position: 111, near '#store(type = 'rdbms', datasource = 'APIM_ANALYTICS_DB')
define table ApimAllAlert (type string, tenantDomain string, message string, severity int, alertTimestamp long)'. No extension exist for store:rdbms
at org.wso2.siddhi.core.util.SiddhiClassLoader.loadExtensionImplementation(SiddhiClassLoader.java:45)
at org.wso2.siddhi.core.util.parser.helper.DefinitionParserHelper.addTable(DefinitionParserHelper.java:203)
at org.wso2.siddhi.core.util.SiddhiAppRuntimeBuilder.defineTable(SiddhiAppRuntimeBuilder.java:125)
at org.wso2.siddhi.core.util.parser.SiddhiAppParser.defineTableDefinitions(SiddhiAppParser.java:320)
at org.wso2.siddhi.core.util.parser.SiddhiAppParser.parse(SiddhiAppParser.java:224)
at org.wso2.siddhi.core.SiddhiManager.createSiddhiAppRuntime(SiddhiManager.java:65)
at org.wso2.siddhi.core.SiddhiManager.createSiddhiAppRuntime(SiddhiManager.java:74)
at org.wso2.carbon.stream.processor.core.internal.StreamProcessorService.deploySiddhiApp(StreamProcessorService.java:100)
at org.wso2.carbon.stream.processor.core.internal.StreamProcessorDeployer.deploySiddhiQLFile(StreamProcessorDeployer.java:93)
... 14 more
And many more like this, one for each alert type.
Any help with this is appreciated.
Thanks
The needed tables will be created automatically in most cases. The exceptions are the following data sources, which need to be created only if you are using the corresponding functionality:
1. Metrics DB
2. Microgateway analytics DB
However, it seems the issue you are facing is that the server is not recognising siddhi-store-rdbms.jar packed in the /lib folder. Please check whether it is available; it is packed by default.
Niveathika,
We are currently not using the microgateway functionality, so I don't know whether I need that database populated, but what I found was that I had to populate two database schemas: WSO2_APIM_ANALYTICS_GEO_LOCATION_DATA and WSO2_APIM_ANALYTICS_DASHBOARD. I found the schema for WSO2_APIM_ANALYTICS_DASHBOARD in the Stream Processor server.
Here are those two schemas for anyone like me struggling to migrate over to MSSQL.
WSO2_APIM_ANALYTICS_DASHBOARD
IF OBJECT_ID('[dbo].[DASHBOARD_RESOURCE]', 'U') IS NOT NULL
DROP TABLE [dbo].[DASHBOARD_RESOURCE]
GO
CREATE TABLE [dbo].[DASHBOARD_RESOURCE](
[ID] [int] IDENTITY(1,1) NOT NULL,
[URL] [varchar](100) NOT NULL,
[OWNER] [varchar](100) NOT NULL,
[NAME] [varchar](256) NOT NULL,
[DESCRIPTION] [varchar](1000) NULL,
[PARENT_ID] [int] NOT NULL,
[LANDING_PAGE] [varchar](100) NOT NULL,
[CONTENT] [varbinary](max) NULL,
CONSTRAINT [PK_DASHBOARD_RESOURCE] PRIMARY KEY CLUSTERED
(
[URL] ASC,
[OWNER] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
IF OBJECT_ID('[dbo].[WIDGET_RESOURCE]', 'U') IS NOT NULL
DROP TABLE [dbo].[WIDGET_RESOURCE]
GO
CREATE TABLE [dbo].[WIDGET_RESOURCE](
[WIDGET_ID] [varchar](255) NOT NULL,
[WIDGET_NAME] [varchar](255) NOT NULL,
[WIDGET_CONFIGS] [varbinary](8000) NULL,
CONSTRAINT [PK_WIDGET_RESOURCE] PRIMARY KEY CLUSTERED
(
[WIDGET_ID] ASC,
[WIDGET_NAME] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
WSO2_APIM_ANALYTICS_GEO_LOCATION_DATA
CREATE TABLE BLOCKS (
network_cidr varchar(45) DEFAULT NULL,
network BIGINT DEFAULT NULL,
broadcast BIGINT DEFAULT NULL,
geoname_id BIGINT DEFAULT NULL,
registered_country_geoname_id BIGINT DEFAULT NULL,
represented_country_geoname_id BIGINT DEFAULT NULL,
is_anonymous_proxy SMALLINT DEFAULT '0',
is_satellite_provider SMALLINT DEFAULT '0',
postal_code VARCHAR(45) DEFAULT NULL,
latitude DECIMAL(10,4) DEFAULT NULL,
longitude DECIMAL(10,4) DEFAULT NULL,
network_blocks varchar(45) DEFAULT NULL);
CREATE INDEX idx_blocks_network ON BLOCKS (network);
CREATE INDEX idx_blocks_broadcast ON BLOCKS (broadcast);
CREATE INDEX idx_blocks_network_blocks ON BLOCKS (network_blocks);
CREATE TABLE LOCATION (
geoname_id BIGINT NOT NULL,
locale_code VARCHAR(10) DEFAULT NULL,
continent_code VARCHAR(10) DEFAULT NULL,
continent_name VARCHAR(20) DEFAULT NULL,
country_iso_code VARCHAR(10) DEFAULT NULL,
country_name VARCHAR(45) DEFAULT NULL,
subdivision_1_iso_code VARCHAR(10) DEFAULT NULL,
subdivision_1_name VARCHAR(1000) DEFAULT NULL,
subdivision_2_iso_code VARCHAR(10) DEFAULT NULL,
subdivision_2_name VARCHAR(1000) DEFAULT NULL,
city_name VARCHAR(1000) DEFAULT NULL,
metro_code BIGINT DEFAULT NULL,
time_zone VARCHAR(30) DEFAULT NULL,
PRIMARY KEY (geoname_id));
CREATE TABLE IP_LOCATION (
ip VARCHAR(100) NOT NULL,
country_name VARCHAR(200) DEFAULT NULL,
city_name VARCHAR(200) DEFAULT NULL,
PRIMARY KEY (ip)
);
Thanks

How do I create a composite key using LoopBack 4?

CREATE TABLE dbo.Users (
id int NOT NULL IDENTITY(1,1),
uid int NOT NULL,
username nvarchar(65) NULL,
password varchar(100) NULL,
firstname nvarchar(50) NULL,
lastname nvarchar(50) NULL
);
ALTER TABLE dbo.Users ADD CONSTRAINT PK_Users PRIMARY KEY (id, uid);
This is my SQL DDL, and I want to create a data model for this schema using LoopBack 4.
Unfortunately, it doesn't seem that this feature is currently supported. There is an open issue tracking it on the LoopBack GitHub repository.

Complex SQL syntax

I have a game, and in the database I'm saving the user actions by date & time.
CREATE TABLE user_actions
(
aId BIGSERIAL PRIMARY KEY NOT NULL,
userId BIGINT NOT NULL REFERENCES users(uId) DEFERRABLE INITIALLY DEFERRED,
aDate TIMESTAMP without time zone DEFAULT now(),
aType INTEGER NOT NULL DEFAULT 0
);
My users are identified by email:
CREATE TABLE users(
uId BIGSERIAL PRIMARY KEY NOT NULL,
uName VARCHAR (50) NOT NULL,
uEmail VARCHAR (75) UNIQUE NULL
);
and each day new prizes are added; each day has a different number of prizes.
CREATE TABLE prizes(
pId BIGSERIAL PRIMARY KEY NOT NULL,
pDate TIMESTAMP without time zone DEFAULT now(),
pType INTEGER NULL,
pSize INTEGER NULL
);
This query lists each userId and the user's last action date:
select distinct userId, max(aDate) from user_actions GROUP BY userId order by userId;
I want to create a query that will count the number of prizes added since each user's last action.
I'm running:
OS: Debian
DB: Postgresql
code: Django
I think I will use a CTE, though it has not been tested:
WITH last_actions AS (
SELECT DISTINCT userId, MAX(aDate) as last_logged
FROM user_actions
GROUP BY userId ORDER BY userId)
SELECT a.userId, COUNT(b.pDate)
FROM last_actions a, prizes b
WHERE b.pDate >= a.last_logged
GROUP BY a.userId;
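A possible refinement of the sketch above (my own, also untested): use an explicit LEFT JOIN so users with no prizes added since their last action still show up with a count of 0, and drop the DISTINCT and ORDER BY inside the CTE, which are redundant with the GROUP BY:
WITH last_actions AS (
    SELECT userId, MAX(aDate) AS last_logged
    FROM user_actions
    GROUP BY userId
)
SELECT la.userId, COUNT(p.pId) AS prizes_since_last_action
FROM last_actions la
LEFT JOIN prizes p ON p.pDate >= la.last_logged
GROUP BY la.userId
ORDER BY la.userId;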

How to update and set unit_id to NULL, where unit_id is a foreign key to the units table

How do I update and set unit_id to NULL? unit_id is a foreign key, and I need to set it to NULL.
CREATE TABLE troops
(
id serial NOT NULL,
unit_id integer,
type integer NOT NULL,
level integer NOT NULL,
CONSTRAINT troops_pkey PRIMARY KEY (id),
CONSTRAINT troops_unit_id_fkey FOREIGN KEY (unit_id)
REFERENCES units (id) MATCH SIMPLE
ON UPDATE NO ACTION ON DELETE CASCADE
)
WITH (
OIDS=FALSE
);
I have tried to update troops and set unit_id to NULL (I pass NULL, not 0, in the pqxx statement in C++), but I get an error like:
insert or update on table "troops" violates foreign key constraint "troops_unit_id_fkey"
DETAIL: Key (unit_id)=(0) is not present in table "units".
When I try from pgAdmin I can set unit_id to NULL in troops, but pqxx (the C++ code I use to do the update) converts NULL to 0. How can I solve this?
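For reference, and assuming the troops definition above (the id value below is a placeholder): the statement that has to reach the server is a plain SQL NULL assignment. The foreign key never blocks NULL in unit_id; the error only appears because the client substitutes 0, which must then exist in units(id):
-- NULL is allowed by troops_unit_id_fkey; a literal 0 is not, unless units has id = 0
UPDATE troops SET unit_id = NULL WHERE id = 1;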