I'm quite new to development with Google App Engine and the other services of the Google Cloud Platform, and I'd like to create an app with different modules (so they can have their own lifecycles) which use Endpoints.
I'm struggling with API paths because I don't know how to route requests to the right module.
My directory tree looks like this:
/myApp
    /module1
        __init__.py
        main.py
    /module2
        __init__.py
        main.py
    module1.yaml
    module2.yaml
    dispatch.yaml
module1.yaml
application: myapp
runtime: python27
threadsafe: true
module: module1
version: 0
api_version: 1
handlers:
# The endpoints handler must be mapped to /_ah/spi.
# Apps send requests to /_ah/api, but the endpoints service handles mapping
# those requests to /_ah/spi.
- url: /_ah/spi/.*
  script: module1.main.api
libraries:
- name: pycrypto
  version: 2.6
- name: endpoints
  version: 1.0
module2.yaml
application: myapp
runtime: python27
threadsafe: true
module: module2
version: 0
api_version: 1
handlers:
# The endpoints handler must be mapped to /_ah/spi.
# Apps send requests to /_ah/api, but the endpoints service handles mapping
# those requests to /_ah/spi.
- url: /_ah/spi/.*
  script: module2.main.api
libraries:
- name: pycrypto
  version: 2.6
- name: endpoints
  version: 1.0
dispatch.yaml
dispatch:
- url: "*/_ah/spi/*"
  module: module1
- url: "*/_ah/spi/.*"
  module: module2
So I'd like my endpoints to be called with the name of the corresponding module somewhere in the path ('_ah/api/module1' or 'module1/_ah/api'). I don't know what to put in the different .yaml files. I don't even know whether what I'm doing is right, or even possible.
Thanks for your answers.
You can host different endpoints on different modules (now called services); the way to correctly address them is as follows:
https://<service-name>-dot-<your-project-id>.appspot.com/_ah/api
Now, let's say that, as per your description, you have module1 and module2, each one hosting different endpoints. You will call module1 APIs by hitting:
https://module1-dot-<your-project-id>.appspot.com/_ah/api
And in a similar fashion, module2 APIs:
https://module2-dot-<your-project-id>.appspot.com/_ah/api
If you want to dig deeper into how this URL schema works (including versions, which are another important part of the equation here), go read Addressing microservices and the immediately following section, Using API versions.
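To make the version part of the schema concrete: a specific deployed version of a service can be targeted by prefixing the version name to the hostname. For a hypothetical version v1 of module1, that would be:
https://v1-dot-module1-dot-<your-project-id>.appspot.com/_ah/api
Note that with this hostname-based addressing you can likely drop the /_ah/spi rules from your dispatch.yaml entirely, so the first rule doesn't intercept requests aimed at a specific service.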
Related
I'm facing a very common issue on Google Cloud after deployment: I'm getting Cannot find module '/workspace/server.js'. Locally it works fine, but not on gcloud. I've tried a lot of solutions from Google but couldn't get it working. Please suggest anything that might make it work.
Here is my directory and config:
[screenshot: build directory]
YAML config:
runtime: nodejs16 # or another supported version
instance_class: B1
service: adminoper
basic_scaling:
  max_instances: 1
  idle_timeout: 10m
handlers:
- url: /
  static_dir: build
- url: /.*
  secure: always
  redirect_http_response_code: 301
  script: auto
[screenshot: Google Cloud error]
I found a few answers on Google suggesting adding an entrypoint path, but that didn't work. Please do suggest a fix.
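For reference, "adding an entrypoint" means a top-level line like the one below in app.yaml; this is a sketch, assuming the server is started from server.js at the app root (a path not given in the question):
# hypothetical entrypoint; adjust the path to where server.js actually lives
entrypoint: node server.js
With a Node.js runtime and script: auto, App Engine generally needs either such an entrypoint or a start script in package.json to know what to run.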
I have created a serverless aws-nodejs template project, and in it I have organized my js files in the following way:
project root
    .env
    src
        controllers
            <js_files_here>
        helpers
            <js_files_here>
        models
            <js_files_here>
        routes
            <yml_files_here>
And this is my serverless.yml:
service: rest-api
provider:
  name: aws
  runtime: nodejs8.10
  stage: dev
  region: ap-south-1
plugins:
  - serverless-bundle
  - serverless-offline
functions:
  ${file(./src/routes/index.yml)}
and in one of my js files I am trying to use:
require('dotenv').config({ path: './.env' });
So I am trying to load some of the environment variables from this .env file. This works as expected when I test the files locally with sls offline start, but when I deploy to an AWS account the APIs stop working as expected. Also, when I look at the package (rest-api.zip) in the .serverless directory, I do not see all the files from the src directory packaged in there.
So, how do I fix this issue and deploy my project correctly with Serverless on AWS?
Your problem is that webpack fails to include the file in its transitive closure when it works out all the files you need, because dotenv imports the .env file dynamically.
You could use a webpack plugin to explicitly include your env file, such as this one.
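Since the project already uses serverless-bundle, another option is its copyFiles setting, which copies files into the package verbatim; a minimal sketch, assuming that plugin option, looks like this in serverless.yml:
custom:
  bundle:
    copyFiles:
      # copy the .env file into the root of the bundled package
      - from: '.env'
        to: './'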
So I uploaded this layer to AWS using the Serverless framework:
service: webstorm-layer
provider:
  name: aws
  runtime: nodejs8.10
  region: us-east-1
layers:
  nodejs:
    path: nodejs # path to contents on disk
    name: node-webstormlibs # optional, Deployed Lambda layer name
    description: JS shared libs for node
    compatibleRuntimes:
      - nodejs8.10
    allowedAccounts:
      - '*'
The libraries I need are inside the "nodejs" directory, which also holds my package.json file and the "node_modules" directory. So far all looks fine, but when I try to run a lambda that uses the "node-webstormlibs" layer, I get the message:
"errorMessage": "Cannot find module 'pg'",
The pg module actually exists in the zip file from which the layer is created. Now I have doubts about how to import a module that lives inside the layer. In some tutorials I see:
import pg from "pg";
as usual, but in others I see:
import pg from "/opt/pg";
or even:
import pg from "/opt/nodejs/node_modules/pg";
I don't know whether the "path:" option in my serverless.yml is correct, though.
On the server, the path is:
NODE_PATH=/opt/nodejs/node8/node_modules/:/opt/nodejs/node_modules:$LAMBDA_RUNTIME_DIR/node_modules
UPDATE
Putting everything in the dir /nodejs/node8 did the trick.
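For clarity, the layer zip layout that matches the first NODE_PATH entry above would look like this (a sketch; pg stands in for any bundled package):
nodejs/
    node8/
        node_modules/
            pg/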
I have the following project tree, where the nodejs folder is a Lambda layer defined in the following serverless.yml:
service: aws-nodejs # NOTE: update this with your service name
provider:
  name: aws
  runtime: nodejs8.10
  stage: dev
plugins:
  - serverless-offline
layers:
  layer1:
    path: nodejs # required, path to layer contents on disk
    name: ${self:provider.stage}-layerName # optional, Deployed Lambda layer name
functions:
  hello:
    handler: handler.hello
    layers:
      - {Ref: Layer1LambdaLayer}
    events:
      - http:
          path: /dev
          method: get
layer1 contains only the uuid package.
So when I try to run the lambda locally using the serverless-offline plugin, it says it can't find the module uuid.
But when I deploy the code to AWS, it runs like a charm.
Is there any way to get Lambda layers running locally, for testing purposes and to speed up development?
Or is there a way to dynamically point the node_modules path at the layer folder during development, and change it back to the proper one when I push to production?
OK, after many trials, I figured out a working solution.
I added an npm run command which exports a temporary node_modules path pointing at the layer:
"scripts": {
"offline": "export NODE_PATH=\"${PWD}/nodejs/node_modules\" && serverless offline"
},
This way, Node can look up the node modules inside the subfolder.
I got around this by running serverless-offline in a container and using gulp to copy my layers into the /opt/ directory, with a gulp watch monitoring for layer changes and re-copying them.
I use layers in serverless-offline by installing the layer from the local file system as a dev dependency:
npm i <local_path_to_my_layer_package> --save-dev
BTW this issue was fixed in sls 1.49.0.
Just run:
sudo npm i serverless
Then you should specify a package include in the layers section of serverless.yml:
service: aws-nodejs # NOTE: update this with your service name
provider:
  name: aws
  runtime: nodejs8.10
  stage: dev
plugins:
  - serverless-offline
layers:
  layer1:
    path: nodejs # required, path to layer contents on disk
    package:
      include:
        - node_modules/**
    name: ${self:provider.stage}-layerName # optional, Deployed Lambda layer name
functions:
  hello:
    handler: handler.hello
    layers:
      - {Ref: Layer1LambdaLayer}
    events:
      - http:
          path: /dev
          method: get
Tested on the nodejs10.x runtime.
I don't think that my GAE Python app has a flexible environment, because I created it many years ago. Now I want to try to create a module that has a runtime other than Python, and keep the Python app running alongside the new runtime, custom or just another standard one. Maybe mix PHP and Python or similar. I don't need it, but I want to learn and explore the possibilities. I'm also interested in learning Erlang and deploying Erlang code with App Engine. I see there are already questions about it:
erlang on google app engine?
And issue 125 in the tracker.
But how should we actually do it? Should we make our own runtime, provided that is allowed?
My app.yaml looks like:
application: montaoproject
version: newsearch
runtime: python27
api_version: 1
threadsafe: true
module: default
instance_class: F1
automatic_scaling:
  min_idle_instances: 5
  max_idle_instances: automatic
  min_pending_latency: automatic
  max_pending_latency: 30ms
  max_concurrent_requests: 50
default_expiration: "14d 5h"
env_variables:
  GAE_USE_MONTAO: 'anyvalue'
  KOOL_VERSION: '17a'
includes:
- br.yaml # Brazil
- in.yaml # India
- us.yaml # USA
- pk.yaml
- search.yaml # search pages
- admin.yaml # admin pages
- providers.yaml # auth providers
- statics.yaml # static content
handlers:
- url: /(business|ai|newindia|insert-ad.html)
  script: montao.app
- url: /blobview.*
  script: kool_update.app
  login: admin
- url: /market.*
  script: main.app
- url: /
  script: montao.app
- url: /(index.html|sign-up.html|login.html)
  script: montao.app
- url: /(login.*|login|googlogin|googlogout|create/)
  script: login.app
- url: /(customer_service.htm|contactfileupload|support.html|faq.html)
  script: customer_service.app
- url: /stats.*
  script: google.appengine.ext.appstats.ui.app
# All other URLs use main.app
- url: /.*
  script: main.app
inbound_services:
- mail
builtins:
- remote_api: on
- deferred: on
#- appstats: on
error_handlers:
- file: default_error.html
libraries:
- name: webapp2
  version: latest
- name: jinja2
  version: latest
- name: setuptools
  version: latest
- name: markupsafe
  version: latest
- name: django
  version: latest
- name: PIL
  version: latest
- name: webob
  version: latest
- name: lxml
  version: latest
- name: ssl
  version: latest
Yes, your app.yaml file is a standard env one (it doesn't have vm:true or env:flex in it).
Yes, it's possible to mix and match services/modules in different languages and with different environments inside the same app. You can even switch the language and environment of the same module in a different version of that module. That's because modules offer complete code isolation, see Comparison of service isolation and project isolation. Related post: Upload a Java and node.js project to Google AppEngine at once
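As an illustration of mixing languages, a hypothetical PHP service could live next to the Python default module with its own yaml along these lines (a sketch based on the standard env PHP runtime; the module name and script file are made up):
application: montaoproject
module: phpmodule
version: 1
runtime: php55
api_version: 1
handlers:
- url: /.*
  script: main.php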
I always try to structure a multi-service/module app with each service in its own subdir, as described in Can a default service/module in a Google App Engine app be a sibling of a non-default one in terms of folder structure?
So first I'd create a default subdirectory of your app dir and move all your existing default-module-specific files into it, with the exception of the app-level config, which I'd keep at the top level and symlink inside the default dir, as described in that post. Then I'd verify that the default module still works as expected.
Then I'd create a new subdirectory for every new module I need to add and add the code for it as needed.
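The resulting layout might look something like this (directory names are made up; phpmodule is the hypothetical new service sketched above):
/myapp
    app.yaml            # kept at top level, symlinked inside default/
    /default
        app.yaml        # symlink to ../app.yaml
        main.py
    /phpmodule
        phpmodule.yaml
        main.php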
Side note: sharing code via symlinks as described in the post mentioned above works for standard env modules, but it probably doesn't work with flexible ones; see Sharing code between modules in a GAE project.