I want to add and list all the instances/VMs under my project in Google Cloud using the jclouds API. In this code, I am treating a node as an instance.
I have set all the required variables and extracted the private key from the JSON file. The context build completes successfully.
images = compute.listImages() => lists all the images provided by Google.
nodes = compute.listNodes() => should list the nodes, but instead throws a NullPointerException.
Output=>
No of images 246
Exception in thread "main" java.lang.NullPointerException: group
at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:229)
at org.jclouds.compute.internal.FormatSharedNamesAndAppendUniqueStringToThoseWhichRepeat.checkGroup(FormatSharedNamesAndAppendUniqueStringToThoseWhichRepeat.java:124)
at org.jclouds.compute.internal.FormatSharedNamesAndAppendUniqueStringToThoseWhichRepeat.sharedNameForGroup(FormatSharedNamesAndAppendUniqueStringToThoseWhichRepeat.java:120)
at org.jclouds.googlecomputeengine.compute.functions.FirewallTagNamingConvention$Factory.get(FirewallTagNamingConvention.java:39)
at org.jclouds.googlecomputeengine.compute.functions.InstanceToNodeMetadata.apply(InstanceToNodeMetadata.java:68)
at org.jclouds.googlecomputeengine.compute.functions.InstanceToNodeMetadata.apply(InstanceToNodeMetadata.java:43)
at com.google.common.base.Functions$FunctionComposition.apply(Functions.java:211)
at com.google.common.collect.Iterators$8.transform(Iterators.java:794)
at com.google.common.collect.TransformedIterator.next(TransformedIterator.java:48)
at com.google.common.collect.Iterators$7.computeNext(Iterators.java:646)
at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:143)
at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:138)
at com.google.common.collect.Iterators.addAll(Iterators.java:356)
at com.google.common.collect.Iterables.addAll(Iterables.java:350)
at com.google.common.collect.Sets.newLinkedHashSet(Sets.java:328)
at org.jclouds.compute.internal.BaseComputeService.listNodes(BaseComputeService.java:335)
at org.jclouds.examples.compute.basics.Example.main(Example.java:54)
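For context, the message "NullPointerException: group" comes from Guava's two-argument Preconditions.checkNotNull(reference, errorMessage): the stack trace shows jclouds deriving a group name from the instance name, and for instances that were not created through jclouds (and so don't follow its naming convention) that group can apparently come back null. A minimal sketch of the failing check, using an illustrative variable rather than jclouds' own code:
import com.google.common.base.Preconditions;

public class CheckNotNullDemo {
    public static void main(String[] args) {
        // Stand-in for the group jclouds parses out of the instance name;
        // null reproduces the exact message seen in the stack trace above.
        String group = null;
        Preconditions.checkNotNull(group, "group"); // java.lang.NullPointerException: group
    }
}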
package org.jclouds.examples.compute.basics;
import static com.google.common.base.Charsets.UTF_8;
import static org.jclouds.compute.config.ComputeServiceProperties.TIMEOUT_SCRIPT_COMPLETE;
import java.io.File;
import java.io.IOException;
import java.util.Properties;
import java.util.Set;
import java.util.concurrent.TimeUnit;
import org.jclouds.ContextBuilder;
import org.jclouds.compute.ComputeService;
import org.jclouds.compute.ComputeServiceContext;
import org.jclouds.compute.domain.ComputeMetadata;
import org.jclouds.compute.domain.Image;
import org.jclouds.domain.Credentials;
import org.jclouds.enterprise.config.EnterpriseConfigurationModule;
import org.jclouds.googlecloud.GoogleCredentialsFromJson;
import org.jclouds.logging.slf4j.config.SLF4JLoggingModule;
import org.jclouds.sshj.config.SshjSshClientModule;
import com.google.common.base.Supplier;
import com.google.common.collect.ImmutableSet;
import com.google.common.io.Files;
import com.google.inject.Module;
public class Example {
public static void main(String[] args)
{
String provider = "google-compute-engine";
String identity = "***@developer.gserviceaccount.com";
String credential = "path to private key file ";
credential = getCredentialFromJsonKeyFile(credential);
Properties properties = new Properties();
long scriptTimeout = TimeUnit.MILLISECONDS.convert(20, TimeUnit.MINUTES);
properties.setProperty(TIMEOUT_SCRIPT_COMPLETE, scriptTimeout + "");
Iterable<Module> modules = ImmutableSet.<Module> of(
new SshjSshClientModule(),new SLF4JLoggingModule(),
new EnterpriseConfigurationModule());
ContextBuilder builder = ContextBuilder.newBuilder(provider)
.credentials(identity, credential)
.modules(modules)
.overrides(properties);
ComputeService compute=builder.buildView(ComputeServiceContext.class).getComputeService();
Set<? extends Image> images = compute.listImages();
System.out.printf(">> No of images %d%n", images.size());
Set<? extends ComputeMetadata> nodes = compute.listNodes();
System.out.printf(">> No of nodes/instances %d%n", nodes.size());
compute.getContext().close();
}
private static String getCredentialFromJsonKeyFile(String filename) {
try {
String fileContents = Files.toString(new File(filename), UTF_8);
Supplier<Credentials> credentialSupplier = new GoogleCredentialsFromJson(fileContents);
String credential = credentialSupplier.get().credential;
return credential;
} catch (IOException e) {
System.err.printf("Exception reading private key from '%s':%n", filename);
e.printStackTrace();
System.exit(1);
return null;
}
}
}
I am using the EasyMock framework to write unit tests. I have one class which uses java.util.Properties in a public function. Below is sample code:
public URI generateSASURI() throws CloudStorageException, DataProcessingException {
final String token = "token";
try {
return new URI(URIUtils.getAssetURI(azureStorageProperties.getProperty(StorageConstant.STORAGE_ACCOUNT_NAME)
, cloudBlob).toString() + "?" + token);
} catch (URISyntaxException e) {
throw new DataProcessingException("Error while creating SAS URL " + e.getMessage(), e);
}
}
where azureStorageProperties is an instance of java.util.Properties, created as a bean and injected into the class.
Now, while writing the unit test case, I am trying to mock Properties azureStorageProperties and I am getting the error below:
Unable to evaluate the expression Method threw 'java.lang.IllegalStateException' exception.
Below is the class to be tested:
package com.company.ops.azure.storage;
import com.company.ops.azure.storage.constants.StorageConstant;
import com.company.ops.azure.storage.dto.CloudBlob;
import com.company.ops.cloudopsfacade.dto.SignedUrlDTO;
import com.company.ops.cloudopsfacade.exception.storage.CloudStorageException;
import com.company.ops.cloudopsfacade.exception.storage.DataProcessingException;
import com.company.ops.azure.storage.util.URIUtils;
import com.company.ops.cloudopsfacade.storage.ICloudStorageClient;
import com.microsoft.azure.storage.blob.CloudBlobClient;
import com.microsoft.azure.storage.blob.SharedAccessBlobHeaders;
import com.microsoft.azure.storage.blob.SharedAccessBlobPermissions;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import javax.inject.Inject;
import java.net.MalformedURLException;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.EnumSet;
import java.util.Optional;
import java.util.Properties;
import java.util.Set;
public class StorageClient implements ICloudStorageClient {
private static final Logger LOG = LoggerFactory.getLogger(StorageClient.class);
@Inject
private CloudBlobClient cloudBlobClient;
@Inject
Properties azureStorageProperties;
@Inject
private SASTokenGenerator sasTokenGenerator;
private long sasUrlDuration;
private Optional<String> sasUrlDurationKey;
public StorageClient() {
sasUrlDurationKey = Optional.ofNullable(System.getenv(StorageConstant.SAS_URL_DURATION_KEY));
if (sasUrlDurationKey.isPresent()) {
try {
sasUrlDuration = Integer.parseInt(sasUrlDurationKey.get());
} catch(NumberFormatException ex) {
LOG.debug("sasURLDurationKey invalid" + ex.getMessage());
sasUrlDuration = StorageConstant.DEFAULT_SAS_URL_DURATION;
}
} else {
sasUrlDuration = StorageConstant.DEFAULT_SAS_URL_DURATION;
}
}
// NOTE: This constructor exists only for the test case, since @Mock of a final class is not supported in EasyMock.
public StorageClient(long sasUrlDuration, CloudBlobClient cloudBlobClient) {
this.sasUrlDuration = sasUrlDuration;
this.cloudBlobClient = cloudBlobClient;
}
/**
 *
 * @param containerName
 * @param blob
 * @param expiryTime
 * @return {@link URI}
 * @throws CloudStorageException
 * @throws DataProcessingException
 */
private URI generateSASURI(String containerName,
String blob, Long expiryTime) throws CloudStorageException, DataProcessingException {
CloudBlob cloudBlob = new CloudBlob(Optional.ofNullable(containerName)
.orElseThrow(() -> new CloudStorageException("container name is null")),
Optional.ofNullable(blob).orElseThrow(() -> new CloudStorageException("blob name is null")));
// TODO: Need to check permissions; currently CREATE and WRITE are assigned.
final Set<SharedAccessBlobPermissions> permissions = EnumSet.of(SharedAccessBlobPermissions.WRITE, SharedAccessBlobPermissions.CREATE);
final SharedAccessBlobHeaders contentHeaders = new SharedAccessBlobHeaders();
// in case the duration needs to be set manually via the API
if (Optional.ofNullable(expiryTime).isPresent()) {
sasUrlDuration = expiryTime;
}
sasTokenGenerator.sasTokenGeneratorInitializer(
cloudBlobClient,
cloudBlob,
permissions,
sasUrlDuration,
contentHeaders);
final String token = sasTokenGenerator.getToken();
try {
return new URI(URIUtils.getAssetURI(azureStorageProperties.getProperty(StorageConstant.STORAGE_ACCOUNT_NAME)
, cloudBlob).toString() + "?" + token);
} catch (URISyntaxException e) {
throw new DataProcessingException("Error while creating SAS URL " + e.getMessage(), e);
}
}
/**
 *
 * @param containerName
 * @param blob
 * @param expiryTime
 * @return {@link URI}
 * @throws CloudStorageException
 * @throws DataProcessingException
 */
@Override
public SignedUrlDTO generateSignedUrl(String containerName, String blob, Long expiryTime)
throws CloudStorageException, DataProcessingException, MalformedURLException {
try {
URI uri = generateSASURI(containerName, blob, expiryTime);
SignedUrlDTO signedUrlDTO = new SignedUrlDTO();
signedUrlDTO.setSignedURL(uri.toURL());
return signedUrlDTO;
} catch (DataProcessingException ex) {
LOG.error(ex.getMessage());
throw ex;
} catch (CloudStorageException ex) {
LOG.error(ex.getMessage());
throw ex;
} catch (MalformedURLException e) {
LOG.error("Unable to get URL");
throw e;
}
}
}
Below is the complete code for the unit test:
package com.company.ops.azure.storage;
import com.company.ops.azure.constants.AzureConstants;
import com.company.ops.azure.storage.constants.StorageConstant;
import com.company.ops.azure.storage.util.URIUtils;
import com.company.ops.cloudopsfacade.dto.SignedUrlDTO;
import com.company.ops.cloudopsfacade.exception.storage.CloudStorageException;
import com.company.ops.cloudopsfacade.exception.storage.DataProcessingException;
import com.microsoft.azure.storage.StorageCredentials;
import com.microsoft.azure.storage.StorageCredentialsAccountAndKey;
import com.microsoft.azure.storage.blob.CloudBlobClient;
import org.easymock.*;
import static org.easymock.EasyMock.*;
import org.junit.After;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import javax.inject.Inject;
import java.net.MalformedURLException;
import java.net.URISyntaxException;
import java.util.Properties;
@RunWith(EasyMockRunner.class)
public class StorageClientTest extends EasyMockSupport {
private CloudBlobClient cloudBlobClient;
@Mock
private SASTokenGenerator sasTokenGenerator;
@Mock
private Properties azureStorageProperties;
@Mock
private StorageClient storageClient; /*= new StorageClient(StorageConstant.DEFAULT_SAS_URL_DURATION,
cloudBlobClient, azureStorageProperties);*/
@Before
public void setup() throws URISyntaxException {
resetAll();
}
@After
public void tearDown() {
verifyAll();
}
@Test
public void testGenerateSASURI() throws MalformedURLException, CloudStorageException, DataProcessingException {
String containerName = "myprojectlocal";
String blob = "testfile";
Long expiryTime = 1000L;
/*Properties azureStorageProperties = EasyMock.createMockBuilder(Properties.class)
.addMockedMethod("getProperty", String.class).createMock();*/
//azureStorageProperties = new Properties();
// azureStorageProperties.setProperty(StorageConstant.STORAGE_ACCOUNT_NAME, "storage-account-name");
expect(azureStorageProperties.getProperty(StorageConstant.STORAGE_ACCOUNT_NAME)).andReturn("storage-account-name");
expect(sasTokenGenerator.getToken()).andReturn("token");
sasTokenGenerator.sasTokenGeneratorInitializer(anyObject(CloudBlobClient.class),
anyObject(),
anyObject(),
anyLong(),
anyObject());
expectLastCall();
//azureStorageProperties.getProperty(anyString()); //.andReturn("storage-account-name");
//expectLastCall();
//expect(azureStorageProperties.getProperty(StorageConstant.STORAGE_ACCOUNT_NAME)).andReturn("storage-account-name");
replayAll();
SignedUrlDTO signedUrlDTO = storageClient.generateSignedUrl(containerName, blob, expiryTime);
Assert.assertNotNull(signedUrlDTO);
}
private CloudBlobClient getCloudBlobClient() {
final StorageCredentials credentials = new StorageCredentialsAccountAndKey(TestConstants.STORAGE_ACCOUNT_NAME,
TestConstants.STORAGE_ACCOUNT_KEY);
try {
return new CloudBlobClient(URIUtils.getStorageAccountURI(TestConstants.STORAGE_ACCOUNT_NAME), credentials);
} catch (URISyntaxException e) { /* swallow and fall through to return null */ }
return null;
}
}
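One plausible fix, offered as a hedged sketch rather than a verified solution: the debugger error aside, @Mock is applied to StorageClient itself, so the test exercises a mock instead of the real class. With EasyMock 3.2+, annotating a real instance with @TestSubject lets EasyMockRunner inject the @Mock fields into it by type, so Properties.getProperty is mocked while the real generateSignedUrl runs. The null CloudBlobClient below is a placeholder assumption:
@RunWith(EasyMockRunner.class)
public class StorageClientTest extends EasyMockSupport {
    @Mock
    private SASTokenGenerator sasTokenGenerator;
    @Mock
    private Properties azureStorageProperties;
    // Real object under test; EasyMockRunner injects the @Mock fields above
    // into matching fields of the @TestSubject by type.
    @TestSubject
    private final StorageClient storageClient =
            new StorageClient(StorageConstant.DEFAULT_SAS_URL_DURATION, null /* CloudBlobClient placeholder */);

    @Test
    public void testGenerateSASURI() throws Exception {
        expect(azureStorageProperties.getProperty(StorageConstant.STORAGE_ACCOUNT_NAME))
                .andReturn("storage-account-name");
        sasTokenGenerator.sasTokenGeneratorInitializer(anyObject(), anyObject(), anyObject(),
                anyLong(), anyObject());
        expectLastCall();
        expect(sasTokenGenerator.getToken()).andReturn("token");
        replayAll();
        Assert.assertNotNull(storageClient.generateSignedUrl("myprojectlocal", "testfile", 1000L));
        verifyAll();
    }
}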
I am using Flink v1.7.1. I have finished a Flink streaming job with a TableSource, SQL, and a TableSink, but I have no idea how to add a unit test for it.
I found a good example of how to test Flink SQL with the help of the user mailing list; here is the example.
package org.apache.flink.table.runtime.stream.sql;
import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.tuple.Tuple3;
import org.apache.flink.api.java.tuple.Tuple5;
import org.apache.flink.api.java.typeutils.RowTypeInfo;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.table.runtime.utils.JavaStreamTestData;
import org.apache.flink.table.runtime.utils.StreamITCase;
import org.apache.flink.test.util.AbstractTestBase;
import org.apache.flink.types.Row;
import org.junit.Test;
import java.util.ArrayList;
import java.util.List;
/**
* Integration tests for streaming SQL.
*/
public class JavaSqlITCase extends AbstractTestBase {
@Test
public void testRowRegisterRowWithNames() throws Exception {
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
StreamTableEnvironment tableEnv = TableEnvironment.getTableEnvironment(env);
StreamITCase.clear();
List<Row> data = new ArrayList<>();
data.add(Row.of(1, 1L, "Hi"));
data.add(Row.of(2, 2L, "Hello"));
data.add(Row.of(3, 2L, "Hello world"));
TypeInformation<?>[] types = {
BasicTypeInfo.INT_TYPE_INFO,
BasicTypeInfo.LONG_TYPE_INFO,
BasicTypeInfo.STRING_TYPE_INFO};
String[] names = {"a", "b", "c"};
RowTypeInfo typeInfo = new RowTypeInfo(types, names);
DataStream<Row> ds = env.fromCollection(data).returns(typeInfo);
Table in = tableEnv.fromDataStream(ds, "a,b,c");
tableEnv.registerTable("MyTableRow", in);
String sqlQuery = "SELECT a,c FROM MyTableRow";
Table result = tableEnv.sqlQuery(sqlQuery);
DataStream<Row> resultSet = tableEnv.toAppendStream(result, Row.class);
resultSet.addSink(new StreamITCase.StringSink<Row>());
env.execute();
List<String> expected = new ArrayList<>();
expected.add("1,Hi");
expected.add("2,Hello");
expected.add("3,Hello world");
StreamITCase.compareWithList(expected);
}
}
The related code is here.
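If StreamITCase is not on your classpath (it lives in Flink's internal test utilities, which are not published for every version), a common alternative is a sink that collects results into a static list and asserting on it after env.execute(). A minimal sketch; CollectSink is a hypothetical helper name:
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.apache.flink.streaming.api.functions.sink.SinkFunction;
import org.apache.flink.types.Row;

// Hypothetical helper: gathers output rows so the test can assert on them
// after env.execute() returns.
public static class CollectSink implements SinkFunction<Row> {
    public static final List<String> VALUES = Collections.synchronizedList(new ArrayList<>());
    @Override
    public void invoke(Row value) {
        VALUES.add(value.toString()); // Row.toString() joins fields with ","
    }
}

// usage in the test, replacing the StreamITCase sink:
// resultSet.addSink(new CollectSink());
// env.execute();
// Assert.assertTrue(CollectSink.VALUES.contains("1,Hi"));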
I have a CSV file which contains a header and data. I want to insert the file's data into BigQuery. I have written code to read the file header and use it for the table/column mapping, so the import is dynamic (in BigQuery I have created one static empty table).
My Cloud Dataflow job executed successfully, but no data was inserted into my BigQuery table. I am not sure what the problem is.
I ran the code below in Eclipse:
package com.coe.cog;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.util.*;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.transforms.SimpleFunction;
import org.apache.beam.sdk.values.PCollection;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.google.api.services.bigquery.model.TableReference;
import com.google.api.services.bigquery.model.TableRow;
public class DemoPipeline_SameCodeProcess {
private static final Logger LOG = LoggerFactory.getLogger(DemoPipeline_SameCodeProcess.class);
// Get project, dataset & table
public static TableReference getGCDSTableReference() {
TableReference ref = new TableReference();
ref.setProjectId("myownprojectbqs");
ref.setDatasetId("DS_Employee");
ref.setTableId("tLoadEmp");
return ref;
}
// Split the input file into header and data separately
static class TransformToTable extends DoFn<String, TableRow> {
@ProcessElement
public void processElement(ProcessContext c) throws IOException {
BufferedReader br = null;
String line = "";
String csvSplitBy = ",";
Integer incFlg = 0;
StringReader strdr = new StringReader(c.element().toString());
br = new BufferedReader(strdr);
line = br.readLine(); //Header as FirstLine
String[] colmnsHeader = line.split(csvSplitBy); //Only Header array
while ((line = br.readLine()) != null) {
// Content of the file excluding header
String[] colmnsList = line.split(csvSplitBy);
TableRow row = new TableRow();
for (int i = 0; i < colmnsList.length; i++) {
row.set(colmnsHeader[i], colmnsList[i]);
}
c.output(row);
}
}
}
public static void main(String[] args) {
MyOptions options = PipelineOptionsFactory.fromArgs(args).withValidation().as(MyOptions.class);
options.setTempLocation("gs://demo-bucket-data/temp");
Pipeline p = Pipeline.create(options);
PCollection<String> lines = p.apply("Read From Storage", TextIO.read().from("gs://demo-bucket-data/Demo/Test/MasterLoad_WithHeader.csv"));
PCollection<TableRow> rows = lines.apply("Transform To Table",ParDo.of(new TransformToTable()));
rows.apply("Write To Table",BigQueryIO.writeTableRows().to(getGCDSTableReference())
.withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND)
.withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER));
p.run();
}
}
Input file format (MasterLoad_WithHeader.csv):
ID,NAME,AGE,SEX
1,John,25,M
2,Smith,28,M
3,Josephine,22,F
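One likely cause, offered as a hedged diagnosis: TextIO.read() emits one element per line of the file, not the whole file. So inside processElement, br.readLine() consumes that single line as the "header" and the while loop exits immediately, meaning c.output(row) is never called and nothing reaches BigQuery. A sketch of a per-line DoFn under that assumption (the hard-coded header matches the sample file above; LineToRow is an illustrative name):
import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.transforms.DoFn;

// Sketch: treat each element as one CSV line; skip the header line itself.
static class LineToRow extends DoFn<String, TableRow> {
    private static final String[] HEADER = {"ID", "NAME", "AGE", "SEX"};

    @ProcessElement
    public void processElement(ProcessContext c) {
        String line = c.element();
        if (line.startsWith("ID,")) {
            return; // this element is the header row
        }
        String[] cols = line.split(",");
        TableRow row = new TableRow();
        for (int i = 0; i < cols.length && i < HEADER.length; i++) {
            row.set(HEADER[i], cols[i]);
        }
        c.output(row);
    }
}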
I have two RESTful service methods, getCustomerJson and getCustomerXML, in a class CustomerResource, where I am using the Jersey API for RESTful web services. All the parameters of the two methods are the same, except that one produces XML and the other produces JSON.
When I send an HTTP GET request with the header Content-Type="application/json", it always invokes the getCustomerXML method, which returns XML.
Can someone explain how Jersey works in this kind of situation?
import java.net.URI;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;
import javax.ws.rs.Consumes;
import javax.ws.rs.DELETE;
import javax.ws.rs.GET;
import javax.ws.rs.POST;
import javax.ws.rs.PUT;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.WebApplicationException;
import javax.ws.rs.core.Context;
import javax.ws.rs.core.HttpHeaders;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;
import domain.Customer;
#Path("/customers")
public class CustomerResource {
private static Map<Integer, Customer> customerDB = new ConcurrentHashMap<Integer, Customer>();
private static AtomicInteger idCounter = new AtomicInteger();
// Constructor
public CustomerResource() {
}
@GET
@Produces(MediaType.TEXT_PLAIN)
public String sayHello() {
return "Hello Kundan !!!";
}
@GET
@Path("{id}")
@Produces("application/xml")
public Customer getCustomerXML(@PathParam("id") int id, @Context HttpHeaders header) {
final Customer customer = customerDB.get(id);
List<String> contentList = header.getRequestHeader("Content-Type");
List<String> languageList = header.getRequestHeader("Accept-Language");
List<String> compressionFormatList = header.getRequestHeader("Content-Type");
if (customer == null) {
throw new WebApplicationException(Response.Status.NOT_FOUND);
}
return customer;
}
@GET
@Path("{id}")
@Produces("application/json")
public Customer getCustomerJson(@PathParam("id") int id) {
final Customer customer = customerDB.get(id);
if (customer == null) {
throw new WebApplicationException(Response.Status.NOT_FOUND);
}
return customer;
}
@POST
@Consumes("application/xml")
public Response createCustomer(Customer customer) {
customer.setId(idCounter.incrementAndGet());
customerDB.put(customer.getId(), customer);
System.out.println("Created customer " + customer.getId());
return Response.created(URI.create("/customers/" + customer.getId())).build();
}
@PUT
@Path("{id}")
@Consumes("application/xml")
public void updateCustomer(@PathParam("id") int id, Customer customer) {
Customer current = customerDB.get(id);
if (current == null)
throw new WebApplicationException(Response.Status.NOT_FOUND);
current.setFirstName(customer.getFirstName());
current.setLastName(customer.getLastName());
current.setStreet(customer.getStreet());
current.setCity(customer.getCity());
current.setState(customer.getState());
current.setZip(customer.getZip());
current.setCountry(customer.getCountry());
}
@DELETE
@Path("{id}")
public void deleteCustomer(@PathParam("id") int id) {
customerDB.remove(id);
System.out.println("Deleted !");
}
}
Use Accept: application/json. The Accept header tells the server what type you want back. Content-Type is for the type of data you are sending to the server, as with a POST request.
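A quick sketch of sending that header from Java (the URL and id are illustrative, assuming the app is deployed at the context root):
import java.net.HttpURLConnection;
import java.net.URL;

public class AcceptHeaderDemo {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://localhost:8080/customers/1"); // illustrative URL
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        // Accept drives Jersey's content negotiation between the two @GET methods.
        conn.setRequestProperty("Accept", "application/json");
        System.out.println("HTTP " + conn.getResponseCode());
    }
}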
I'm writing a fairly simple RESTful web service project in NetBeans (using the Maven Web Application template). I am trying to run it on a GlassFish 4.1 server. I have used Tomcat in the past, but that's not really an option here. Basically, my problem is that when I run the project the server starts, but I just get a 404 error when I try to access the service in the browser.
Here is my source code:
package jp.go.aist.limrs;
import com.hp.hpl.jena.rdf.model.Model;
import java.io.IOException;
import java.io.InputStream;
import java.io.StringWriter;
import java.io.UnsupportedEncodingException;
import java.net.HttpURLConnection;
import java.net.MalformedURLException;
import java.net.URI;
import java.net.URISyntaxException;
import java.net.URL;
import java.net.URLEncoder;
import java.util.logging.Level;
import java.util.logging.Logger;
import javax.ws.rs.PathParam; // the JAX-RS PathParam, not javax.websocket.server.PathParam
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import org.apache.commons.io.IOUtils;
#Path("/target")
public class ParserService
{
public static final String SERVER_LOC = "http://localhost:8080/LiMRS/";
public static final String MAPPINGS_LOC = "export.txt";
private String targetUrl;
private String microData;
private Model uDataModel;
private Model mappingsModel;
public ParserService() {}
public ParserService( String url )
{
this.targetUrl = url;
try {
parseMicro(url);
} catch (MalformedURLException e) {
e.printStackTrace();
}
}
@GET
@Path("/{url:.+}")
@Produces(MediaType.TEXT_PLAIN)
public String getMicro(@PathParam("url") String target)
{
this.targetUrl = target;
String domain = "_";
try {
URI uri = new URI(this.targetUrl);
domain = uri.getHost();
System.out.println("Domain is " + domain + "\n\n\n");
} catch (URISyntaxException ex) {
Logger.getLogger(ParserService.class.getName()).log(Level.SEVERE, null, ex);
}
this.microData = "<?xml version=\"1.0\" encoding=\"utf-8\"?>" +
"<rdf:RDF xml:base=\"http://dbpedia.org/ontology/\" " +
"xmlns:_=\"" + domain + "\">\n\n";
try
{
parseMicro(URLEncoder.encode(this.targetUrl, "UTF-8"));
} catch (UnsupportedEncodingException e) {
e.printStackTrace();
} catch (MalformedURLException e) {
return "";
}
return this.microData;
}
private void parseMicro(String target) throws MalformedURLException
{
try {
URL url = new URL("http://getschema.org/microdataextractor?url=" + target + "&out=rdf");
HttpURLConnection conn;
try {
conn = (HttpURLConnection) url.openConnection();
conn.setRequestMethod("GET");
conn.setRequestProperty("Accept", "application/json");
InputStream ins = conn.getInputStream();
StringWriter writer = new StringWriter();
IOUtils.copy(ins, writer, null);
this.microData += writer.toString() + ".";
} catch (IOException ex) {
Logger.getLogger(ParserService.class.getName()).log(Level.SEVERE, null, ex);
}
} catch (MalformedURLException ex) {
Logger.getLogger(ParserService.class.getName()).log(Level.SEVERE, null, ex);
}
}
}
The URL I'm using to test the service is: http://localhost:8080/LiMRS/target/http://www.rottentomatoes.com/m/jurassic_park/
(I know the URL is unencoded. There are forward slashes in the 'resource' part of the URL, after "/target/", but that is taken care of by the regex in the code and is not the source of the problem.)
It's possible the problem is with the server itself; I don't know if there is any special configuration that needs to be done to GlassFish, or if you can just run the project outright. I don't have a web.xml file, and unless I'm mistaken, I don't think I need one. What am I missing here?
You're going to need a web.xml or an Application/ResourceConfig subclass to configure the application. If you don't want to use a web.xml, the easiest thing you can do is have an empty Application subclass annotated with @ApplicationPath. This will cause the registration of all the @Path classes you have:
@ApplicationPath("/")
public class JaxRsApplication extends Application {}
You can see more options here.
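Alternatively, a ResourceConfig subclass (Jersey 2, which GlassFish 4.1 ships with) lets you point the scanner at your resource package; the package name below is taken from the question's code:
import javax.ws.rs.ApplicationPath;
import org.glassfish.jersey.server.ResourceConfig;

@ApplicationPath("/")
public class JaxRsApplication extends ResourceConfig {
    public JaxRsApplication() {
        // Scan this package for @Path-annotated resource classes.
        packages("jp.go.aist.limrs");
    }
}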