How to start an Apollo federation server only when all services are available

I want to start a federated apollo server:
const gateway = new ApolloGateway({
  serviceList: [
    // ... list of services
  ],
});

const startServer = async () => {
  const gatewayConfig = await gateway.load();
  const server = new ApolloServer({
    ...gatewayConfig,
    subscriptions: false,
  });
  server.listen().then(({ url }) => {
    console.log("Server running!");
  });
};

startServer();
When I start the server and one of the services in the serviceList is unavailable, the server still starts and logs which services have failed. I want the server to start only when all the services are available, i.e. when one service is unavailable, an error should be thrown and the server should stop. Any ideas how to do this?

Apollo can't do this as of the time of writing this answer. The only solution is to monitor the availability of the services manually and enable or disable the Apollo server accordingly. I used apollo-server-express for this.
Below is a demonstration of how I managed to toggle my Apollo gateway based on the availability of my services.
Basically, you wrap the middleware of your Apollo server. This allows you to swap out your Apollo server instance, as well as to return an error when the services are not available.
import express from 'express';
import { ApolloServer } from 'apollo-server-express';
import bodyParser from 'body-parser'; // use express body-parser for convenience

// your list of federated services
const serviceList = [
  { name: 'service1', url: 'http://service1/graphql' }
];

// a variable to store the server. We will need to replace it when a service
// goes offline or comes back online again
let server = null;

// set up express
const app = express();
app.use(bodyParser.json());
app.use(customRouterToLeverageApolloServer); // defined below

// middleware to delegate requests to the apollo server
function customRouterToLeverageApolloServer(req, res, next) {
  // if services are down (no apollo instance), return an error
  if (!server) {
    res.json({ error: 'services are currently not available' });
    return;
  }
  // else pass the request to apollo
  const router = server.getMiddleware(); // https://www.apollographql.com/docs/apollo-server/api/apollo-server/#apolloservergetmiddleware
  return router(req, res, next);
}
function servicesAreAvailable() {
  // go through your serviceList and check availability
}

// periodically check the availability of your services and create/destroy an
// ApolloServer instance accordingly. This also serves as the indication of
// whether or not your services are available at the time.
// you might want to also call this function at startup
setInterval(() => {
  if (servicesAreAvailable()) {
    server = new ApolloServer({ ... });
  } else {
    server = null;
  }
}, 1000 * 60 * 5); // check every 5 minutes


Google Cloud Function with different endpoints, scheduled with Google Cloud Scheduler

I have created a project in Express:
const express = require('express');
const app = express();
const PORT = 5555;

app.listen(PORT, () => {
  console.log(`Server running on port ${PORT}`);
});

app.get('/tr', (req, res, next) => {
  res.json({ status: 200, data: 'tr' });
});

app.get('/po', (req, res, next) => {
  res.json({ status: 200, data: 'po' });
});

module.exports = {
  app
};
I deployed it on Cloud Functions under the name my-transaction,
and I am scheduling it with Google Cloud Scheduler, giving it a URL like
http://url/my-transaction/po
When I deploy it without authentication, the scheduled job runs successfully, but when I deploy it with authentication, it fails.
Similarly, if I create a sample project like the one below
exports.helloHttp = (req, res) => {
  res.json({ status: 200, data: 'test hello' });
};
and deploy it, configured the same as above with authentication, it works.
The only difference is that in the last case the function name matches the entry point, while in the project above the entry point is app, with different endpoints.
Any help appreciated. Thanks.
This is because you need to add auth information to your HTTP requests in Cloud Scheduler.
First you need to create a service account with the role Cloud Functions Invoker.
When you have created the service account, you can see that it has an associated email address, for example:
cfinvoker@fakeproject.iam.gserviceaccount.com
After that you can create a new scheduler job with auth information by following these steps:
Select target HTTP
Write the URL (the cloud function URL)
Click on "Show more"
Select Auth header > Add OIDC token
Write the full email address of the service account
This new scheduler job will send the HTTP request with the auth information needed to execute your cloud function successfully.
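If you prefer the CLI to the console, the same job can be created with gcloud. This is a sketch only: the job name, schedule, and URI below are placeholders I made up, and the service-account email is the example one from above.

```shell
# Create an HTTP-target Cloud Scheduler job that attaches an OIDC token,
# signed for the given service account, to every request it sends.
gcloud scheduler jobs create http my-transaction-job \
  --schedule="*/10 * * * *" \
  --uri="https://REGION-PROJECT_ID.cloudfunctions.net/my-transaction/po" \
  --http-method=GET \
  --oidc-service-account-email="cfinvoker@fakeproject.iam.gserviceaccount.com"
```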

AWS X-Ray not monitoring the DynamoDB DAX client in Node.js

I recently started using DynamoDB DAX within my Node Lambda function. However, with the 'amazon-dax-client' framework, I can no longer transparently capture the HTTP requests made by the framework, like so:
const AWS = AWSXRay.captureAWS(require('aws-sdk'));
const dynamoDB = AWSXRay.captureAWSClient(new AWS.DynamoDB(defaults.db.config));
I know I could create an async capture, but I am wondering if there is a better way to do this, like the previous way, and if someone has managed to capture requests made with the DAX client in the same way as with the DynamoDB client from the AWS SDK.
DAX doesn't currently support XRay, because DAX doesn't use the standard AWS SDK HTTP client (it doesn't use HTTP at all).
The team has received other requests for XRay support so it's certainly something we're considering.
While there is no official X-Ray support from the DAX team, I wrote a little snippet as a workaround.
const db = new Proxy(documentClient, {
  get: (target: any, prop: any) => {
    if (typeof target[prop] === "function") {
      return (...args: any) => {
        const segment = xray
          .getSegment()
          ?.addNewSubsegment(`DynamoDB::${prop}`);
        const request = target[prop](...args);
        const promise = request
          .promise()
          .then((response: any) => {
            segment?.close();
            return response;
          })
          .catch((err: any) => {
            segment?.close();
            return Promise.reject(err);
          });
        return {
          ...request,
          promise: () => promise,
        };
      };
    }
    return target[prop];
  },
});
const response = await db.query(...).promise();
Tested in AWS Lambda in a VPC private subnet with an AWS X-Ray service endpoint.
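The Proxy technique above can be seen in isolation with a stubbed client. Everything below is made up for illustration (plain JavaScript, no AWS SDK or X-Ray; the tracing calls are reduced to comments), but the wrapping logic is the same:

```javascript
// Stand-alone sketch of the Proxy-wrapping pattern from the snippet above,
// using a stubbed DocumentClient-like object instead of the AWS SDK.
const fakeClient = {
  query: (params) => ({
    promise: () => Promise.resolve({ Items: [{ id: 1 }] }),
  }),
};

const traced = new Proxy(fakeClient, {
  get: (target, prop) => {
    if (typeof target[prop] === 'function') {
      return (...args) => {
        // In the real snippet, an X-Ray subsegment is opened here...
        const request = target[prop](...args);
        const promise = request.promise().finally(() => {
          // ...and closed here, whether the call succeeded or failed.
        });
        // Preserve the request's shape but serve the wrapped promise.
        return { ...request, promise: () => promise };
      };
    }
    return target[prop];
  },
});
```

Calling traced.query({ TableName: 't' }).promise() then resolves exactly like the stub's own promise, while the wrapper gets a chance to run its bookkeeping around it.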

Using Google Cloud Functions API as Service Discovery

We're working on developing a microservice based architecture employing Google Cloud Functions.
We've developed a few functions and want to implement a discovery service. This discovery service would be used to determine if a specific function exists and is operational.
The service discovery itself is a cloud function. It makes a REST request to the API below and succeeds in local development using the functions emulator and the default application credentials.
Google provides an API for this: https://cloud.google.com/functions/docs/reference/rest/v1beta2/projects.locations.functions/get
When deployed to production we're receiving:
{ "code": 401, "message": "Request is missing required authentication credential. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.", "status": "UNAUTHENTICATED" }
Cloud functions are stateless, so there's no option to use a service account that I can see. How do I go about authenticating a cloud function to call the Functions API to determine whether a function is available?
Below is how we've accomplished this in a local dev environment:
var options = {
  method: 'get',
  uri: `https://cloudfunctions.googleapis.com/v1beta2/projects/${config.PROJECT_ID}/locations/${config.LOCATION_ID}/functions/${functionName}`
};
console.log(options.uri);
request(options, function (err, res, body) {
  if (!err && res.statusCode === 200) {
    if (typeof res.body.httpsTrigger.url !== 'undefined') {
      console.log('found a function');
      return cb(false, res.body.httpsTrigger.url);
    } else {
      console.log('no function found, looking for static content');
      return cb(true, `Service doesn't exist, returned ${res.statusCode}`);
    }
  } else {
    console.log('no function found, looking for static content');
    return cb(true, `Service doesn't exist, returned ${res.statusCode}`);
  }
});
So I finally figured out how to do this. It's a bit of a hack:
Download the JSON Key file for the Service Account.
Add the JSON file to the Function source.
Install the google-auth-library NPM module.
Modify the request to use the client from google-auth-library
const keys = require('./VirtualAssistantCred.json');
const {auth} = require('google-auth-library');

const client = auth.fromJSON(keys);
client.scopes = ['https://www.googleapis.com/auth/cloud-platform'];
client.authorize().then(function () {
  const url = `https://cloudfunctions.googleapis.com/v1beta2/projects/${config.PROJECT_ID}/locations/${config.LOCATION_ID}/functions/${functionName}`;
  client.request({url}, function (err, res) {
    if (err) {
      console.log(`Error: ${err}`);
      return cb(true, `Service ${functionName} doesn't exist, returned ${res}`);
    } else {
      console.log(`RESPONSE: ${JSON.stringify(res.data)}`);
      console.log(`Found Service at ${res.data.httpsTrigger.url}`);
      return cb(false, res.data.httpsTrigger.url);
    }
  });
});
The API you mention contains the method projects.locations.functions.get, which requires one of the following OAuth scopes:
https://www.googleapis.com/auth/cloudfunctions
https://www.googleapis.com/auth/cloud-platform
You may have a look at the Authentication in HTTP Cloud Functions online document.

Setting Up Apollo Server with subscriptions-transport-ws?

It seems like I have my server set up according to the Apollo docs at http://dev.apollodata.com/tools/apollo-server/setup.html. In my server/main.js file:
//SET UP APOLLO INCLUDING APOLLO PUBSUB
const executableSchema = makeExecutableSchema({
  typeDefs: Schema,
  resolvers: Resolvers,
  connectors: Connectors,
  logger: console,
});

const GRAPHQL_PORT = 8080;
const graphQLServer = express();

// `context` must be an object and can't be undefined when using connectors
graphQLServer.use('/graphql', bodyParser.json(), apolloExpress({
  schema: executableSchema,
  context: {}, //at least(!) an empty object
}));

graphQLServer.use('/graphiql', graphiqlExpress({
  endpointURL: '/graphql',
}));

graphQLServer.listen(GRAPHQL_PORT, () => console.log(
  `GraphQL Server is now running on http://localhost:${GRAPHQL_PORT}/graphql`
));
It prints "GraphQL Server is now running on http://localhost:8080/graphql" to the terminal, indicating that the server was successfully initialized.
But at the top of my main_layout component, when I run this code:
import { Client } from 'subscriptions-transport-ws';
const wsClient = new Client('ws://localhost:8080');
...I get this console message:
WebSocket connection to 'ws://localhost:8080/' failed: Connection closed before receiving a handshake response
What am I missing?
You need to create a dedicated websocket server. It will run on a different port, and the code to set it up is provided in the subscriptions-transport-ws package.
Take a look on the following code from GitHunt-API example:
https://github.com/apollostack/GitHunt-API/blob/master/api/index.js#L101-L134
You will also see that this code depends on a class called SubscriptionManager. It is a class from a package called graphql-subscriptions, also by the Apollo team, and you can find an example of how to use it here:
https://github.com/apollostack/GitHunt-API/blob/master/api/subscriptions.js
TL;DR: You can use graphql-up to quickly get a GraphQL server with subscriptions support up and ready. Here's a more detailed tutorial on using this in combination with Apollo and the websocket client subscriptions-transport-ws.
Obtain a GraphQL Server with one click
Let's say you want to build a Twitter clone based on this GraphQL Schema in IDL syntax:
type Tweet {
  id: ID!
  title: String!
  author: User! @relation(name: "Tweets")
}

type User {
  id: ID!
  name: String!
  tweets: [Tweet!]! @relation(name: "Tweets")
}
Click this button to receive your own GraphQL API and then open the Playground, where you can add some tweets, query all tweets and also test out subscriptions.
Simple to use API
First, let's create a user that will be the author for all coming tweets. Run this mutation in the Playground:
mutation createUser {
  createUser(name: "Tweety") {
    id # copy this id for future mutations!
  }
}
Here's how you query all tweets and their authors stored at your GraphQL server:
query allTweets {
  allTweets {
    id
    title
    createdAt
    author {
      id
      name
    }
  }
}
Subscription support using websockets
Let's now subscribe to new tweets from "Tweety". This is the syntax:
subscription createdTweets {
  Tweet(filter: {
    mutation_in: [CREATED]
    node: {
      author: {
        name: "Tweety"
      }
    }
  }) {
    node {
      id
      title
      createdAt
      author {
        id
        name
      }
    }
  }
}
Now create a new tab in the Playground and create a new Tweet:
mutation createTweet {
  createTweet(
    title: "#GraphQL Subscriptions are awesome!"
    authorId: "<id-from-above>"
  ) {
    id
  }
}
You should see a new event popping up in your other tab where you subscribed before.
Here is a demo using Apollo GraphQL, React & Hapi: https://github.com/evolastech/todo-react. It's less overwhelming than GitHunt-React & GitHunt-API.
Seems like you aren't actually creating the websocket server. Use SubscriptionServer. Keep in mind that it is absolutely NOT true that you have to have a dedicated websocket port (I thought this once too), as davidyaha says. I have both my normal queries and subscriptions on the same port.
import { createServer } from 'http';
import { SubscriptionServer } from 'subscriptions-transport-ws';
import { execute, subscribe } from 'graphql';
import { schema } from './my-schema';

// All your graphQLServer.use() etc. setup goes here, MINUS the
// graphQLServer.listen(); you'll do that with websocketServer:

// Create WebSocket listener server
const websocketServer = createServer(graphQLServer);

// Bind it to port and start listening
websocketServer.listen(3000, () => console.log(
  `Server is now running on http://localhost:3000`
));

const subscriptionServer = SubscriptionServer.create(
  {
    schema,
    execute,
    subscribe,
  },
  {
    server: websocketServer,
    path: '/subscriptions',
  },
);

Loopback soap connector not populating the model with the service methods

I am having trouble consuming a SOAP web service using loopback-connector-soap. I followed the instructions from GitHub and here.
I am able to consume the web service provided in the example, but when I replace it with my WSDL it doesn't work. There are no errors after connecting, but after creating the model, calling any method exposed by the web service fails with the error message:
Object function ModelConstructor(data, options) { if (!(this instanceof ModelConstructor)) { return new ModelConstructor(data, options); } if (ModelClass.settings.unresolved) { throw new Error('Model ' + ModelClass.modelName + ' is not defined.'); } ModelBaseClass.apply(this, arguments); } has no method 'GetAgentBalance'
My code snippet is shown below:
var api = {};
var ds = loopback.createDataSource('soap', {
  connector: 'loopback-connector-soap',
  remotingEnabled: true,
  wsdl: 'https://jambopay.com/agencyservices?WSDL', // The url to WSDL
  wsdl_options: {
    rejectUnauthorized: false,
    strictSSL: false,
    requestCert: true,
  }
});

// Unfortunately, the methods from the connector are mixed in asynchronously.
// This is a hack to wait for the methods to be injected
ds.once('connected', function () {
  // Create the model
  var service = ds.createModel('AgencyService', {});
  api.getBalance = function (callback) {
    // Displays the parsed WSDL
    console.log(ds.connector);
    var timestamp = moment().format('YYYY-MM-DD HH:mm:ss');
    var pass = crypto.createHash('sha1').update(apiUsername + timestamp + apiKey).digest("hex");
    // GetAgentBalance is a method in the web service.
    service.GetAgentBalance({ username: apiUsername, timestamp: timestamp, pass: pass }, function (err, response) {
      var result = (!err && response.Balance) ? response.Balance : null;
      callback(err, result);
    });
  };
});

module.exports = api;
The console.log(ds.connector) displays the WSDL content plus a flag parsed: true, meaning it was parsed successfully.
Kindly help me figure out why the model is not being populated with the methods from the web service, which is preventing the consumption of the web service. Please note that this web service works fine in ASP.NET, Java, and PHP, as well as several other languages and frameworks. I'm stuck at this point and your assistance is highly appreciated.