I am trying to develop an Alexa skill that fetches information from a DynamoDB database. To do that, I have to import the aws-sdk.
But for some reason, when I import it, my skill stops working. The skill does not even open. My code is hosted in the Alexa Developer Console.
Here's what happens:
In the testing panel, when I input 'Open Cricket Update' (the app name), Alexa's response is, 'There was a problem with the requested skill's response'.
This happens only when I import the aws-sdk.
What am I doing wrong?
index.js
const Alexa = require('ask-sdk-core');
const AWS = require('aws-sdk');
AWS.config.update({region:'us-east-1'});
const table = 'CricketData';
const docClient = new AWS.DynamoDB.DocumentClient();
const LaunchRequestHandler = {
canHandle(handlerInput) {
return Alexa.getRequestType(handlerInput.requestEnvelope) === 'LaunchRequest';
},
handle(handlerInput) {
const speakOutput = 'Hello! Welcome to cricket update.';
return handlerInput.responseBuilder
.speak(speakOutput)
.reprompt(speakOutput)
.getResponse();
}
};
package.json
{
"name": "hello-world",
"version": "1.1.0",
"description": "alexa utility for quickly building skills",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"author": "Amazon Alexa",
"license": "ISC",
"dependencies": {
"ask-sdk-core": "^2.6.0",
"ask-sdk-model": "^1.18.0",
"aws-sdk": "^2.326.0"
}
}
You are missing the exports.handler block at the end of your index.js that "builds" the skill composed from your handlers, e.g.
exports.handler = Alexa.SkillBuilders.custom()
.addRequestHandlers(LaunchRequestHandler)
.lambda();
A more complete example can be found here.
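For context, here is a hedged sketch of how the docClient declared at the top of the question's index.js could be used inside an intent handler once the skill builder above is in place. The intent name GetScoreIntent, the key attribute matchId, and the update attribute are assumptions for illustration, not part of the original code:
// Hypothetical intent handler that reads one item from the CricketData table.
// Register it alongside LaunchRequestHandler in .addRequestHandlers(...).
const GetScoreIntentHandler = {
    canHandle(handlerInput) {
        return Alexa.getRequestType(handlerInput.requestEnvelope) === 'IntentRequest'
            && Alexa.getIntentName(handlerInput.requestEnvelope) === 'GetScoreIntent';
    },
    async handle(handlerInput) {
        const params = { TableName: table, Key: { matchId: 'latest' } }; // assumed key schema
        const data = await docClient.get(params).promise();
        const speakOutput = data.Item
            ? `Here is the latest update: ${data.Item.update}`
            : 'Sorry, I could not find any cricket data.';
        return handlerInput.responseBuilder
            .speak(speakOutput)
            .getResponse();
    }
};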
After running expo install expo-firebase-core expo-firebase-analytics and downloading both google-services.json and GoogleService-Info.plist from the Firebase console and placing them in the root of my project, when I call Analytics.logEvent, Expo Go gives an error:
Possible Unhandled Promise Rejection (id: 0):
Error: Firebase is not configured. Ensure that you have configured 'google-services.json' correctly.
This is my TopLevelComponent.js:
import React from 'react'
import * as Analytics from 'expo-firebase-analytics';
import { createRootNavigator } from './router'
const RootNavigator = createRootNavigator()
const TopLevelComponent = props => {
const { screenProps } = props;
const { checkLogin } = screenProps;
const getActiveRouteName = navigationState => {
if (!navigationState) {
return null
}
const route = navigationState.routes[navigationState.index]
// Parse the nested navigators
if (route.routes) return getActiveRouteName(route)
return route.routeName
}
return (
<RootNavigator
onNavigationStateChange={async (prevState, currentState) => {
const currentScreen = getActiveRouteName(currentState)
const prevScreen = getActiveRouteName(prevState)
if (prevScreen !== currentScreen) {
checkLogin()
Analytics.logEvent('event')
}
}}
screenProps={props.screenProps}
/>
);
}
export default TopLevelComponent
Am I missing any other config?
Is there any other way to configure firebase-analytics besides these files?
I'm using expo-44.0.6 and expo-firebase-analytics-6.0.1.
I had the same error.
This is how I fixed it:
Go to app.json and add
"googleServicesFile": "./GoogleService-Info.plist"
under the "ios" section.
example:
"expo": {
"name": "",
"slug": "",
"version": "",
"orientation": "",
"icon": "",
"splash": {
"image": "",
"resizeMode": "",
"backgroundColor": ""
},
"updates": {
"fallbackToCacheTimeout":
},
"assetBundlePatterns": [
"**/*"
],
"ios": {
"supportsTablet":,
"bundleIdentifier": "",
"googleServicesFile": "./GoogleService-Info.plist"
},
Similarly for Android:
"android": {
"googleServicesFile": "./google-services.json",
"adaptiveIcon": {
"foregroundImage": "",
"backgroundColor": ""
}
},
Add this under the "web" section:
"web": {
"config": {
"firebase": {
"apiKey": "",
"authDomain": "",
"projectId": "",
"storageBucket": "",
"messagingSenderId": "",
"appId": "",
"measurementId": "G-**********"
}
},
"favicon": "./assets/favicon.png"
}
Then in the app:
import * as Analytics from 'expo-firebase-analytics';
const pageView = async (routeName, params) => { await Analytics.logEvent(routeName, params); };
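To tie this back to the question's TopLevelComponent.js, a hedged sketch of calling that helper from onNavigationStateChange; Firebase event names only allow letters, digits, and underscores, so the route name is sanitized here as a precaution:
// Hypothetical usage inside the question's onNavigationStateChange callback.
if (prevScreen !== currentScreen) {
    checkLogin();
    // Sanitize the route name so it is a valid Firebase event name (assumption).
    await pageView(currentScreen.replace(/[^A-Za-z0-9_]/g, '_'));
}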
I had the same issue. In my case I was using the Expo Bare Workflow with SDK 45.
I only added this line in my android/build.gradle:
dependencies {
classpath("com.android.tools.build:gradle:4.1.0")
classpath 'com.google.gms:google-services:4.3.10' /* Add this line */
}
And at the top of the android/app/build.gradle file:
apply plugin: "com.android.application"
apply plugin: 'com.google.gms.google-services' /* Add this line */
Clean the project and run: npx react-native run-android
That works for me :)
I was experiencing a similar issue connecting to analytics on Firebase. I had all the configurations mentioned above for web and iOS. I wanted to set up analytics for my Expo app and only use the expo-firebase-analytics library. I was testing the connection from an iOS simulator.
My issue was resolved simply by adding a second app to my Firebase project for the iOS platform. The GoogleService-Info.plist file was auto-generated in Firebase and available to download and place in my project.
Initially, I had only added an app to my Firebase project for the web platform, so I was unable to establish a connection between the iOS simulator and the analytics on Firebase.
I have the following file, deposit-form.js, with the following code:
new Vue({
el: '#app',
data: {
title: 'title',
depositForm: {
chosenMethod: 'online',
payMethods: [
{ text: 'Already paid via Venmo', value: 'venmo' },
{ text: 'Pay online', value: 'online' },
{ text: 'In-person payment', value: 'person' }
],
},
},
methods: {
submitDeposit: function() {
$.ajax({
url: 'http://localhost:8000/api/v1/deposit/',
type:'post',
data: $('#deposit-form').serialize(),
success: function() {
$('#content').fadeOut('slow', function() {
// Animation complete.
$('#msg-success').addClass('d-block');
});
},
error: function(e) {
console.log(e.responseText);
},
});
},
showFileName: function(event) {
var fileData = event.target.files[0];
var fileName = fileData.name;
$('#file-name').text('selected file: ' + fileName);
},
},
});
I'm having trouble figuring out how to set up Jest and how to import the Vue functions inside 'methods' so I can test them with Jest.
What should my code in deposit-form.test.js look like?
The first thing you need to do is export the Vue app instance.
// deposit-form.js
import Vue from 'vue/dist/vue.common';
export default new Vue({
el: '#app',
data: {...},
...
});
Now you can use this code in your spec file, but you need the #app element to exist before the tests run. This can be done with a Jest setup file. Here is why it's needed: when you import your main file (deposit-form.js) into a test, an instance of Vue is created with new, and Vue tries to mount the application onto the #app element. But that element is not in your DOM, which is why you need to add it just before running the tests.
In this setup file you can also import jQuery globally, so you can use it in your tests without importing it separately.
// jest-env.js
import $ from 'jquery';
global.$ = $;
global.jQuery = $;
const mainAppElement = document.createElement('div');
mainAppElement.id = 'app';
document.body.appendChild(mainAppElement);
The Jest setup file must be specified in the "jest" configuration section of package.json.
// package.json
{
...,
"dependencies": {
"jquery": "^3.3.1",
"vue": "^2.6.7"
},
"devDependencies": {
"#babel/core": "^7.0.0",
"#babel/plugin-transform-modules-commonjs": "^7.2.0",
"#babel/preset-env": "^7.3.4",
"#vue/test-utils": "^1.0.0-beta.29",
"babel-core": "^7.0.0-bridge.0",
"babel-jest": "^24.1.0",
"babel-loader": "^8.0.5",
"babel-preset-vue": "^2.0.2",
"jest": "^24.1.0",
"vue-jest": "^3.0.3",
"vue-template-compiler": "^2.6.7",
"webpack": "^4.29.5",
"webpack-cli": "^3.2.3"
},
"scripts": {
"test": "./node_modules/.bin/jest --passWithNoTests",
"dev": "webpack --mode development --module-bind js=babel-loader",
"build": "webpack --mode production --module-bind js=babel-loader"
},
"jest": {
"moduleFileExtensions": [
"js",
"json",
"vue"
],
"transform": {
"^.+\\.js$": "<rootDir>/node_modules/babel-jest",
".*\\.(vue)$": "<rootDir>/node_modules/vue-jest"
},
"setupFiles": [
"<rootDir>/jest-env.js"
]
}
}
Also, you will probably need to configure Babel to use ES6 features in your project and tests. This is not necessary if you follow the CommonJS style in your code. A basic .babelrc file contains the following:
// .babelrc
{
"presets": [
[
"#babel/preset-env",
{
"useBuiltIns": "entry",
"targets": {
"browsers": [
"last 2 versions"
]
}
}
],
"vue",
],
"plugins": [
"#babel/plugin-transform-modules-commonjs",
]
}
Now you can write your tests.
// deposit-form.test.js
import App from './deposit-form';
describe('Vue test sample.', () => {
afterEach(() => {
const mainElement = document.getElementById('app');
if (mainElement) {
mainElement.innerHTML = '';
}
});
it('Should mount to DOM.', () => {
// Next line is bad practice =)
expect(App._isMounted).toBeTruthy();
// You have access to your methods
App.submitDeposit();
});
});
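As a hedged follow-up, one more test that exercises another method from the question with a fake change event. It belongs inside the describe block above; the #file-name element and the file name used here are assumptions for illustration:
it('Should show the selected file name.', () => {
    // The question's showFileName writes into a #file-name element,
    // so create one in the jsdom document first.
    const fileNameElement = document.createElement('div');
    fileNameElement.id = 'file-name';
    document.body.appendChild(fileNameElement);

    // Fake change event carrying a single selected file (hypothetical name).
    App.showFileName({ target: { files: [{ name: 'receipt.pdf' }] } });

    expect(fileNameElement.textContent).toBe('selected file: receipt.pdf');
});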
My recommendation is to work through the Vue Test Utils guides and start dividing your code into components. With the current approach, you lose all the power of components and the ability to properly test Vue applications.
I updated my answer a bit. As I understood from the comment on the answer, you include the libraries on the page as separate files. Here is my mistake: I didn't ask whether a build system is being used. The code in my examples is written to the ECMA-2015 standard, but unfortunately browsers do not fully support it, so you need a transpiler that converts the files into a format browsers understand. It sounds hard, but it's not a problem. I updated the contents of package.json in the answer. Now it only remains to create an entry file for the build and run the build itself.
The input file is simple.
// index.js
import './deposit-form';
The build is started with the following commands from terminal.
# for development mode
$ yarn run dev
# or
$ npm run dev
# for production mode
$ yarn run build
# or
$ npm run build
The output file will be placed in the ./dist/ directory. Now, instead of separate files, you only need to include one; it contains everything needed: the libraries and your code.
I used webpack for the build. More information about it can be found in the documentation, and a good example can be found in this article.
As part of my CD pipeline, I am setting up a Google Cloud Function to handle new repo pushes, create Docker images, and push them to the registry. I have it all working on a VM, but there is no need to have one running 24x7 just for this.
So, looking over the Node.js reference libraries, I can't find a way to push an image to a registry using Node. It seems like there is no registry or build SDK for Node?
Basically, all I need is to execute this command from a cloud function:
gcloud builds submit --tag gcr.io/my_project/my_image.
It's quite possible to do this using the Cloud Build API. Here's a simple example using the client library for Node.js.
exports.createDockerBuild = async (req, res) => {
const google = require('googleapis').google;
const cloudbuild = google.cloudbuild({version: 'v1'});
const client = await google.auth.getClient({
scopes: ['https://www.googleapis.com/auth/cloud-platform']
});
const projectId = await google.auth.getProjectId();
const resource = {
"source": {
"storageSource": {
"bucket": "my-source-bucket",
"object": "my-nodejs-source.tar.gz"
}
},
"steps": [{
"name": "gcr.io/cloud-builders/docker",
"args": [
"build",
"-t",
"gcr.io/my-project-name/my-nodejs-image",
"standard-hello-world"
]
}],
"images": ["gcr.io/$PROJECT_ID/my-nodejs-image"]
};
const params = {projectId, resource, auth: client};
const result = await cloudbuild.projects.builds.create(params);
res.status(200).send("200 - Build Submitted");
};
My source code was in a bucket but you could pull it from a repo just as easily.
Bear in mind that you'll need to use the Node.js 8 beta runtime for the async stuff to work.
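If you also want the function to confirm what it submitted, a hedged follow-up: the create call returns a long-running operation whose metadata carries the build id, which can then be used to fetch the build's status. Field names follow the Cloud Build v1 API; the snippet would sit inside the handler above, just before the res.status(200) line, and error handling is omitted:
// Inspect the operation returned by cloudbuild.projects.builds.create().
const operation = result.data;
const buildId = operation.metadata.build.id;

// Fetch the current state of the build (e.g. QUEUED, WORKING, SUCCESS).
const { data: build } = await cloudbuild.projects.builds.get({
    projectId,
    id: buildId,
    auth: client
});
console.log(`Build ${buildId} is ${build.status}`);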
I have been assigned a task in which I have to retrieve transactions from a block on any blockchain network and create a log file, using the Go programming language. I have looked at the Ethereum blockchain and tried to do this with the geth client, but it makes me download the whole blockchain, which is more than 100 GB. So my question is: is there any way to access a block on any blockchain, read its transactions, and use them to create a log file? I just need a head start. Help appreciated, thanks.
Please use the Truffle Ganache Ethereum client.
Download it from
http://truffleframework.com/ganache/
I have created Node.js code to read transactions from the latest block.
Step 1: Install Node.js and npm if they are not already installed on your machine.
Step 2: Create a new folder "demo" and a new package.json file. Place the code below in the package.json file:
{
"name": "transactionRead",
"version": "1.0.0",
"description": "Blockchain Transaction Read",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"dependencies": {
"web3": "^0.19.0"
},
"author": "",
"license": "ISC"
}
Step 3: Create an index.js file and place the code below in it:
var Web3 = require('web3');
var fs = require('fs');
//Create a log file to store transaction
fs.writeFile('log.txt', 'Hello Transaction!', function (err) {
if (err) throw err;
console.log('Created!');
});
// create an instance of web3 using the HTTP provider.
// NOTE in mist web3 is already available, so check first if it's available before instantiating
if (typeof web3 !== 'undefined') {
web3 = new Web3(web3.currentProvider);
} else {
// set the provider you want from Web3.providers
web3 = new Web3(new Web3.providers.HttpProvider("http://localhost:7545"));
}
// Watch for blockchain transaction, if found changes fetch the transaction data
var filter = web3.eth.filter('latest', function (error, blockHash) {
if (!error) {
var block = web3.eth.getBlock(blockHash, true);
if (block.transactions.length > 0) {
console.log("found " + block.transactions.length + " transactions in block " + blockHash);
fs.appendFile('log.txt', JSON.stringify(block.transactions), function (err) {
if (err) throw err;
console.log('Updated!');
});
console.log(JSON.stringify(block.transactions));
} else {
console.log("no transaction in block: " + blockHash);
}
}
});
Step 4: Run node index.js from the command line.
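As a hedged footnote to the example above: the package.json pins web3 0.x, whose filter API is what index.js uses. If you end up on web3 1.x instead, filters were replaced by subscriptions, which need a WebSocket provider; a rough equivalent (the ws://localhost:7545 endpoint for Ganache is an assumption) would look like this:
// web3 1.x sketch: subscriptions replace web3.eth.filter and require a
// WebSocket provider; the Ganache endpoint below is an assumption.
var Web3 = require('web3');
var web3 = new Web3('ws://localhost:7545');

web3.eth.subscribe('newBlockHeaders', async function (error, header) {
    if (error) return console.error(error);
    var block = await web3.eth.getBlock(header.hash, true);
    if (block.transactions.length > 0) {
        console.log('found ' + block.transactions.length + ' transactions in block ' + header.hash);
    }
});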
Let me know if you need any help.
Thanks.
AWS SES works with the Lex test chatbot, but after the chatbot is published to the Slack app it does not work (it doesn't trigger the email service). However, there does not seem to be any problem with the Lambda function, as I am getting the response text back in Slack. And I don't think there is a way to check the error, i.e. why Slack is causing the problem.
Lambda Function:
var aws = require('aws-sdk');
var ses = new aws.SES({
region: 'us-east-1'
});
exports.handler = function(event, context, callback) {
var eParams = {
Destination: {
ToAddresses: [event.currentIntent.slots.Email]
},
Message: {
Body: {
Text: {
Data: "Hi, How are you?"
}
},
Subject: {
Data: "Title"
}
},
Source: "abc#gmail.com"
};
var email = ses.sendEmail(eParams, function(err, data) {
if (err) {
console.log(err, err.stack);
} else {
context.succeed(event);
}
});
callback(null, {
"dialogAction": {
"type": "ConfirmIntent",
"fulfillmentState": "Fulfilled",
"message": {
"contentType": "PlainText",
"content": "message to convey to the user, i.e. Are you sure you want a large pizza?"
}
}
});
};
Edit 1: I figured out that the issue is that I am not getting the values in event.currentIntent.slots.Email when I publish my Lex bot on Slack.
Try the steps below to identify the root cause:
Make sure you have configured your bot with Slack correctly by following this step-by-step tutorial.
If your bot works fine in your test bot (inside Lex) but not on Slack, make sure you have published the latest version of your bot.
Try the code below in your AWS Lambda and see what you get in return:
callback(null, {
"dialogAction": {
"type": "ConfirmIntent",
"fulfillmentState": "Fulfilled",
"message": {
"contentType": "PlainText",
"content": "Echo: " + JSON.stringify(event.currentIntent.slots) <-- This
}
}
});
Hope this helps.
I had a similar problem where the bot worked in the Lex console but not in Slack. While unrelated to email, this is what I discovered:
For some reason, an empty sessionAttributes is passed as null to Slack, so you can't add variables to it. You'd expect it to be {}, so if it is null, change its value to {}:
if(intentRequest.sessionAttributes == null){
intentRequest.sessionAttributes = {};
}
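A hedged note on where that guard could sit in the question's Lambda: intentRequest in the snippet above corresponds to the incoming event, so the normalization would go at the top of the handler, before any slots or attributes are read:
exports.handler = function(event, context, callback) {
    // Slack may deliver sessionAttributes as null instead of {} (per the
    // answer above), so normalize it before doing anything else.
    if (event.sessionAttributes == null) {
        event.sessionAttributes = {};
    }
    // ... rest of the handler from the question, unchanged ...
};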
As per the comment above, this is what got my Slack bot to line up with what my test Lex chat was doing:
session_attributes = intent_request['sessionAttributes'] if intent_request['sessionAttributes'] is not None else {}