How does the Golang chaincode in the Marbles Node.js example work?

I got this example https://github.com/IBM-Blockchain/marbles running locally. I saw that the example downloaded the Golang chaincode from https://github.com/ibm-blockchain/marbles-chaincode, and that the chaincode was stored on disk at /marbles/node_modules/ibm-blockchain-js/temp/unzip.
Could you please explain how the Golang chaincode is executed from the Node.js code?

I haven't looked at the Marbles app in detail, but generally speaking, the Node.js code is just a client to the validator network, and the validators process the Golang-based chaincode in a way that is completely decoupled from the Node.js-based client. In this process, a validator downloads/acquires the chaincode and compiles it locally within an isolated container. You could look at the process like [golang::chaincode]->[nodejs::client]->(network)->[golang::validator]->[golang::container]. So the first and last parts are Golang/chaincode related; the stuff that happens in the middle is more or less transport. I.e. the fact that the client is Node.js and the validator is Golang matters little here.

The Golang code that implements the Marbles chaincode (a.k.a. smart contract) does not get executed inside the Node.js app. The chaincode is what the application interacts with to modify state variables stored in the blockchain. State in this case is: which marbles exist, who owns each one, what color each is, etc. The chaincode itself (the Golang code) is packaged as a Docker container, deployed to the blockchain, and is up and running waiting for transactions. The Node.js code constructs and sends these transactions to the Docker container, receives the results of the chaincode execution, and updates the application's view of the current state.
Just FYI, the Marbles app was written to demonstrate how to build an application running on top of the Hyperledger Fabric project. Hyperledger currently only fully supports Golang as its smart contract language, but more languages are coming soon.

As described here:
Interacting with the cc is done with an HTTP REST call to a peer on the network. The ibc-js SDK abstracts the details of the REST calls away. This allows us to use dot notation to call our GoLang functions (such as chaincode.invoke.init_marble(args)).
The user will interact with our Node.js application in their browser. This client-side JS code will open a websocket to the backend Node.js application. The client JS will send messages to the backend when the user interacts with the site.
The backend Node.js will send HTTP requests (via the SDK) to a blockchain peer to carry out the user's actions. The peer will communicate with its chaincode container at its leisure. Note that the previous HTTP request was really a 'submission' of chaincode to be run; it will actually run at a later time (usually milliseconds later).
The cc container will carry out the desired operation and record it to the ledger, i.e. create/transfer a marble.
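As a rough sketch of what that looks like from the Node.js side (the dot-notation call is the one quoted above; the argument list, callback shape, and how the chaincode object is obtained are assumptions for this sketch):

```typescript
// "chaincode" is the object handed back by the ibc-js SDK after it loads/deploys
// the Go chaincode (e.g. via ibc.load(options, cb)); declared here only for the sketch.
declare const chaincode: any;

// Invoking init_marble results in an HTTP REST call to a peer, which forwards the
// invocation to the chaincode's Docker container; the Go code never runs in Node.js.
const args = ["marble1", "blue", "35", "bob"]; // hypothetical: name, color, size, owner

chaincode.invoke.init_marble(args, (err: Error | null, data: unknown) => {
  if (err) {
    console.error("invoke failed:", err);
    return;
  }
  // The peer has accepted the submission; the chaincode container will execute
  // the transaction and record it to the ledger shortly afterwards.
  console.log("init_marble submitted:", data);
});
```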

Related

NodeJS server send data to C++ process

I have a Node.js server which receives user POST/streaming requests from a web UI.
I have a C++ back-end engine process which does some calculations and sends API calls to other 3rd-party services. The API calls require certain info provided by the web users.
My question is: what is the best way to pass the request data received by Node.js over to the C++ process?
WebUI -> NodeJS -> ??? -> C++ engine
Make your C++ application listen on a TCP or Unix socket.
Make your Node.js application connect to that socket and exchange messages. For the message encoding you can use Google Protocol Buffers, JSON, etc.
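For illustration, here is a minimal sketch of the socket approach from the Node.js side, assuming the C++ engine listens on TCP port 5555 and exchanges newline-delimited JSON (both are assumptions; Protocol Buffers would work the same way over the socket):

```typescript
import * as net from "net";

// Connect to the C++ engine, which is assumed to be listening on localhost:5555.
const engine = net.createConnection({ host: "127.0.0.1", port: 5555 });

function sendToEngine(payload: object): void {
  // One JSON document per line keeps framing simple on the C++ side.
  engine.write(JSON.stringify(payload) + "\n");
}

engine.on("data", (chunk: Buffer) => {
  // Results coming back from the engine, one JSON document per line
  // (a real implementation would also buffer partial lines).
  for (const line of chunk.toString().split("\n").filter(Boolean)) {
    console.log("engine result:", JSON.parse(line));
  }
});

// Example: forward data from a user's POST request to the engine.
sendToEngine({ user: "alice", action: "quote", amount: 100 });
```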
If the information you have lives in the JavaScript layer, another option is to implement a C/C++ addon. If you already have some kind of native module, you can follow the same design based on it (an existing module is very likely based on NAN). If you plan to introduce a brand-new native module, it is a good time to consider N-API. You can get more information about it from:
https://nodejs.org/dist/latest-v11.x/docs/api/n-api.html
https://github.com/nodejs/node-addon-api
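As a rough sketch of what the addon approach looks like from the JavaScript side (the addon path, name, and exported function are all hypothetical; the C++ side would be built with node-addon-api/N-API):

```typescript
// Load a native addon built with node-addon-api / N-API.
// "engine_addon" and its exported processRequest() are hypothetical names for this sketch.
const engine = require("./build/Release/engine_addon");

// The addon runs the C++ engine code in-process, so this call is synchronous;
// a real addon would likely expose an async variant to avoid blocking the event loop.
const result: string = engine.processRequest(JSON.stringify({ user: "alice", amount: 100 }));
console.log(JSON.parse(result));
```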

How to avoid restarting the Hyperledger Composer REST server while upgrading the Composer network (with changes in model files)?

We have a working setup of 3 peer nodes and a multi-user REST server running on one of the peers. There are multiple user cards created and imported into the REST server (using a web-based client), and this is working fine. I can trigger transactions and query the blockchain with it.
However, if I need to upgrade my network and there is some change in a model file (i.e. any participant/asset/transaction parameters change), I need to restart the REST server so that the change is visible to the web-based client application. So my questions are:
1. Is there a way to upgrade the REST interfaces without restarting the server?
2. If the REST server crashed or was restarted, is there some way to use the old cards that were created before the server shutdown?
When the REST server starts you can see that it "discovers" the Business Network and then generates the endpoints. The discovery is not dynamic, so when you change the model or another element of a BNA you need to restart the REST server to re-discover the updated network. (In a live scenario I would expect changes to the model to be infrequent.)
Are you using multi-user mode for the REST server? Assuming you are, then configuring the REST server with a persistent data source, as described in the documentation or in this tutorial, should solve the problem of re-importing the cards; a configuration sketch follows below. You could also "back up" the cards after they have been used the first time by exporting them.
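As an illustration (recalled from the Composer documentation, so treat the environment variable names and connector settings as assumptions to verify against your version), the REST server can be pointed at a persistent MongoDB card store via environment variables, e.g. from a small Node.js launcher:

```typescript
import { spawn } from "child_process";

// Assumed environment variables from the Composer "deploying the REST server" docs:
// COMPOSER_CARD, COMPOSER_MULTIUSER, COMPOSER_AUTHENTICATION and COMPOSER_DATASOURCES.
const env = {
  ...process.env,
  COMPOSER_CARD: "admin@my-network",   // hypothetical business network card
  COMPOSER_MULTIUSER: "true",
  COMPOSER_AUTHENTICATION: "true",
  COMPOSER_DATASOURCES: JSON.stringify({
    db: {
      name: "db",
      connector: "mongodb",            // requires loopback-connector-mongodb to be installed
      host: "mongo",                   // hypothetical MongoDB host
    },
  }),
};

// Launch the REST server; cards imported by users are then stored in MongoDB
// and survive a restart of the REST server.
spawn("composer-rest-server", { env, stdio: "inherit" });
```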

Accessing Ethereum Dapp

Is there a way to access Ethereum Dapps other than the Mist browser? I was thinking along the lines of a normal browser like Chrome. Also, as a sub-question, how are some Android and iOS apps connecting to the blockchain?
You can do that through the Ethereum JSON-RPC: https://github.com/ethereum/wiki/wiki/JSON-RPC
You have to use:
eth_call - read from a contract
eth_sendTransaction - send a transaction to a contract
You must understand that you'll also need a running Ethereum node, most probably with an unlocked account to execute transactions from. That means you don't want to run it on a public network, but rather on a local network. That's essentially what Mist does for you.
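For example, a browser or mobile app can talk to a node directly over JSON-RPC. A minimal sketch, assuming a node exposing RPC at http://localhost:8545 and a hypothetical contract address and call data:

```typescript
// Read from a contract via eth_call against a local node's JSON-RPC endpoint.
// The node URL, contract address and encoded call data are assumptions for this sketch.
async function readContract(): Promise<string> {
  const response = await fetch("http://localhost:8545", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      jsonrpc: "2.0",
      id: 1,
      method: "eth_call",
      params: [
        {
          to: "0x0000000000000000000000000000000000000001", // hypothetical contract address
          data: "0x06fdde03",                                // ABI-encoded selector, e.g. name()
        },
        "latest",
      ],
    }),
  });
  const { result } = await response.json();
  return result; // hex-encoded return value
}

readContract().then(console.log);
```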
Also, take a look at MetaMask; it provides the same API for browser-based apps, but requires an additional plugin to be installed in the browser.

What is a client in a network of Hyperledger Fabric peers?

What is a client in a network of Hyperledger Fabric peers?
What is the role of a client?
What can qualify as a client in the Hyperledger fabric blockchain network?
Have a look at this (and specifically, look into the Network Entities / Systems part):
https://github.com/hyperledger/fabric/blob/master/docs/glossary.md
I'm still rather new to this, but my understanding is that you have a) peers in a P2P network that can be either validators or non-validators - the latter existing mostly for performance purposes; and b) clients, which talk to peers in a client-server manner to issue queries and request transactions from the P2P network.
What can qualify as a client: basically anything that can talk to peers in this manner. (I think there are even some SDKs, but I'm concentrating on other aspects of Hyperledger, so I don't know yet.) Have a look at the IBM Marbles demo:
https://github.com/IBM-Blockchain/marbles
A client application talks to a peer over either a REST or a gRPC interface and submits transactions and queries to chaincodes via the peer.
A client is an end user of the application. The client invokes the smart contract by placing a request on the channel. Each smart contract has a required set of endorsing peers. The request is picked up by the required endorsing peers and executed. The resulting read-write sets are sent back to the client.
What is a client in Hyperledger:
The Hyperledger Fabric Client SDK makes it easy to use APIs to interact with a Hyperledger Fabric blockchain.
Features:
create a new channel
send channel information to a peer so that it can join the channel
install chaincode on a peer
instantiate chaincode in a channel, which involves two steps: propose and transact
submit a transaction, which also involves two steps: propose and transact
query a chaincode for the latest application state
various other query capabilities
logging utility with a built-in logger (winston)
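A minimal sketch of what such a client looks like with the Node.js SDK (API names roughly as of the fabric-network 1.4 package; the connection profile, wallet, identity, channel, chaincode name, and functions are all assumptions):

```typescript
import { Gateway, FileSystemWallet } from "fabric-network";
import * as fs from "fs";

async function main(): Promise<void> {
  // Connection profile describing the peers/orderers, and a wallet holding the
  // client identity - the file names and the "user1" identity are assumptions.
  const connectionProfile = JSON.parse(fs.readFileSync("connection.json", "utf8"));
  const wallet = new FileSystemWallet("./wallet");

  const gateway = new Gateway();
  await gateway.connect(connectionProfile, { wallet, identity: "user1" });

  // The client submits transactions to chaincode ("marbles" here is hypothetical)
  // and queries the latest application state.
  const network = await gateway.getNetwork("mychannel");
  const contract = network.getContract("marbles");

  await contract.submitTransaction("initMarble", "marble1", "blue", "35", "tom");
  const result = await contract.evaluateTransaction("readMarble", "marble1");
  console.log(result.toString());

  gateway.disconnect();
}

main().catch(console.error);
```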

Architecture Design for API of Cloud Service

Background:
I have a local application that processes the user input for about 3 seconds and then returns an answer (output) to the user.
(I deliberately don't want to go into details about my application, to keep this a pure architectural question.)
My Goal:
I want to make my application a service in the cloud and expose an API (for the upcoming website and for clients that will connect to the service without installing the software locally).
Possible Solutions:
1. Deploy WCF in the cloud and use my application there, so clients can invoke the service and use my application in the cloud (RPC style).
2. Use a Web API that inserts the request into a queue; a worker role then dequeues requests and posts the results to a DB. The client sends one request to create the queue entry and another request to get the result (which the Web API reads from the DB).
The Problems:
If I go with the WCF solution (#1), I can't handle large loads of requests, maybe 10-20 simultaneously.
If I go with the WebAPI-Queue-WorkerRole solution (#2), the client will sometimes need to request the results multiple times, which can be a problem.
If I go with the WebAPI-Queue-WorkerRole solution (#2), the process isn't synchronous; the client does not get the result as soon as his request is processed, he has to ask for it.
Questions:
1. In the WebAPI-Queue-WorkerRole solution (#2), can I somehow alert the client once his request has been processed, so that I can save the client the extra requests for the result?
2. Isn't asking multiple times for the result outdated? I remember that 10-15 years ago it was accepted, but now? I know that the VirusTotal API uses this kind of design.
3. Is there a better solution? One that will handle large loads and will be sync or async (returning the result to the client once it is done)?
Thank you.
If you're using Azure, why not simply fire up more servers and use load balancing to handle more load? That way, as your load increases, you have more servers to handle the requests.
Microsoft recently made Azure Service Fabric available, which gives you a lot of control over spinning these services up and shutting them down.
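To address the first question (avoiding result polling), one common pattern is to return a job id from the enqueue endpoint and push a completion notification to the client over a WebSocket when the worker finishes. A minimal sketch using Express and the ws package (the queue and the 3-second computation are simulated here; none of this is Azure-specific code):

```typescript
import express from "express";
import { WebSocketServer, WebSocket } from "ws";
import { randomUUID } from "crypto";
import http from "http";

const app = express();
app.use(express.json());
const server = http.createServer(app);

// Map a job id to the WebSocket of the client waiting for that job.
const waiting = new Map<string, WebSocket>();

// Clients open a socket and register the job id they care about.
const wss = new WebSocketServer({ server, path: "/notifications" });
wss.on("connection", (socket) => {
  socket.on("message", (msg) => waiting.set(msg.toString(), socket));
});

// Enqueue endpoint: returns immediately with a job id.
app.post("/jobs", (req, res) => {
  const jobId = randomUUID();
  // In a real deployment this would go to a durable queue (e.g. Azure Queue Storage)
  // and a worker role would pick it up; here the 3-second computation is simulated.
  setTimeout(() => {
    const socket = waiting.get(jobId);
    if (socket) socket.send(JSON.stringify({ jobId, result: "done" }));
    waiting.delete(jobId);
  }, 3000);
  res.status(202).json({ jobId });
});

server.listen(8080);
```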