I'm trying to make a Hello World application for Google Glass using the GDK.
This is the entire code; I'm still working out how to program this. Compiling doesn't give any errors, but running it does.
package leagueMatch;
import com.google.android.glass.timeline.LiveCard;
import com.google.android.glass.timeline.TimelineManager;
import com.luisdelarosa.helloglass.R;
import android.os.Bundle;
import android.os.IBinder;
import android.app.Activity;
import android.app.Service;
import android.content.Intent;
import android.graphics.Color;
import android.view.Menu;
import android.widget.RemoteViews;
public class MainActivity extends Service {
String blue_team, purple_team, mvp, casters;
int blue_kills, purple_kills;
private LiveCard mLiveCard;
private TimelineManager mTimelineManager;
private RemoteViews mViews;
private static final String TAG = "LeagueMatchInfo";
private static final String LIVE_CARD_ID = "leaguematch";
protected void onCreate(Bundle savedInstanceState) {
super.onCreate();
//mTimelineManager = xxxxxx;
mViews.setTextViewText(blue_kills, "Lol, Let's see if this works");
mLiveCard = mTimelineManager.createLiveCard(LIVE_CARD_ID);
mLiveCard.setViews(mViews);
mLiveCard.setDirectRenderingEnabled(true);
mLiveCard.publish(LiveCard.PublishMode.SILENT);
}
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
return true;
}
@Override
public IBinder onBind(Intent intent) {
return null;
}
}
Isn't this supposed to launch a card which says "Lol, let's see if this works"?
I created a Hello World project on GitHub.
Start the app by saying "ok glass, hello world" or click on the Hello World card shown on the timeline.
The live card has an option menu with two choices:
Say Hi
Close app
Please mark as answer if this helps.
It looks like you're conflating concepts from activities and services. You're right to be using a service to maintain the LiveCard. But you should override the onStartCommand method to publish the card when the service is launched. Please see the source code for the Compass, Stopwatch, and Timer for more examples.
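For reference, here is a minimal sketch of that approach. It assumes the same GDK preview API as your code (TimelineManager.from(); later GDK releases construct new LiveCard(context, tag) directly), and R.layout.live_card / R.id.text are placeholder resources you would define yourself:
public class LiveCardService extends Service {
    private static final String LIVE_CARD_TAG = "leaguematch";
    private LiveCard mLiveCard;
    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        if (mLiveCard == null) {
            // Publish the card when the service is started, not in onCreate().
            mLiveCard = TimelineManager.from(this).createLiveCard(LIVE_CARD_TAG);
            RemoteViews views = new RemoteViews(getPackageName(), R.layout.live_card);
            views.setTextViewText(R.id.text, "Lol, let's see if this works");
            mLiveCard.setViews(views);
            mLiveCard.publish(LiveCard.PublishMode.REVEAL);
        }
        return START_STICKY;
    }
    @Override
    public void onDestroy() {
        // Unpublish so the card leaves the timeline when the service stops.
        if (mLiveCard != null && mLiveCard.isPublished()) {
            mLiveCard.unpublish();
            mLiveCard = null;
        }
        super.onDestroy();
    }
    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }
}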
We use TestNG as our testing framework. We also use Lombok's @Log4j2 annotation to instantiate our log objects. I need to test that some code logs certain messages under certain conditions.
I have seen examples using JUnit and Mockito, but I cannot find how to do it in TestNG. Switching to JUnit is not an option.
Edit
I have implemented a class (CaptureLogger) which extends AbstractLogger
import org.apache.logging.log4j.spi.AbstractLogger;
public class CaptureLogger extends AbstractLogger {
...
}
I am unable to hook it up to the logger for the class under test.
CaptureLogger customLogger = (CaptureLogger) LogManager.getLogger(MyClassUnderTest.class);
generates an error message:
java.lang.ClassCastException: org.apache.logging.log4j.core.Logger cannot be cast to CaptureLogger
I have found out that LogManager.getLogger returns Log4j2's own Logger implementation (typed as the Logger interface), not an instance of my class.
How can I create an instance of my CaptureLogger?
You can define your own appender like this:
package com.xyz;
import static java.util.Collections.synchronizedList;
import java.util.ArrayList;
import java.util.List;
import org.apache.logging.log4j.core.Appender;
import org.apache.logging.log4j.core.Filter;
import org.apache.logging.log4j.core.LogEvent;
import org.apache.logging.log4j.core.appender.AbstractAppender;
import org.apache.logging.log4j.core.config.plugins.Plugin;
import org.apache.logging.log4j.core.config.plugins.PluginAttribute;
import org.apache.logging.log4j.core.config.plugins.PluginElement;
import org.apache.logging.log4j.core.config.plugins.PluginFactory;
@Plugin(name = "LogsToListAppender", category = "Core", elementType = Appender.ELEMENT_TYPE)
public class LogsToListAppender extends AbstractAppender {
private static final List<LogEvent> events = synchronizedList(new ArrayList<>());
protected LogsToListAppender(String name, Filter filter) {
super(name, filter, null);
}
@PluginFactory
public static LogsToListAppender createAppender(@PluginAttribute("name") String name,
@PluginElement("Filter") Filter filter) {
return new LogsToListAppender(name, filter);
}
@Override
public void append(LogEvent event) {
events.add(event);
}
public static List<LogEvent> getEvents() {
return events;
}
}
Then create a file called log4j2-logstolist.xml at the root of the classpath that references the appender:
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN" packages="com.xyz" >
<Appenders>
<LogsToListAppender name="LogsToListAppender" />
</Appenders>
<Loggers>
<Root level="TRACE">
<AppenderRef ref="LogsToListAppender" />
</Root>
</Loggers>
</Configuration>
Take special care to keep the packages="com.xyz" attribute in sync with your appender's package, or the appender won't be found. For more information, check https://www.baeldung.com/log4j2-custom-appender
And finally create the TestNG test:
package com.xyz;
import static org.testng.Assert.assertTrue;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.core.config.Configurator;
import org.testng.annotations.Test;
@Test
public class LogsTest {
static {
Configurator.initialize(null, "classpath:log4j2-logstolist.xml");
}
@Test
public void testLogs() {
// call your code that produces log, e.g.
LogManager.getLogger(LogsTest.class).trace("Hello");
assertTrue(LogsToListAppender.getEvents().size() > 0);
}
}
As you can see, we force Log4j2 to use the custom configuration with Configurator.initialize(null, "classpath:log4j2-logstolist.xml"); when the class is initialized (static {} block).
Keep in mind that it will be useful to check the logger name as well, e.g. LogsToListAppender.getEvents().stream().filter(e -> CLASS_THAT_PRODUCES_LOG.class.getName().equals(e.getLoggerName())).collect(toList());
You can access the actual message using the LogEvent::getMessage() method.
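Putting those two together, a small helper along these lines could extract just the messages a given class logged (a sketch; messagesFor is a made-up name):
// Returns the formatted messages that were logged by the given class.
public static List<String> messagesFor(Class<?> clazz) {
    return LogsToListAppender.getEvents().stream()
            .filter(e -> clazz.getName().equals(e.getLoggerName()))
            .map(e -> e.getMessage().getFormattedMessage())
            .collect(java.util.stream.Collectors.toList());
}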
As long as you're using Lombok for logger generation, you can't do much at the level of the source code itself with the given tools. For example, if you place the @Log4j2 annotation, it generates:
private static final org.apache.logging.log4j.Logger log = org.apache.logging.log4j.LogManager.getLogger(LogExample.class);
The compiled code already comes with this line.
You can try to mock the LogManager.getLogger method with PowerMockito, but I don't really like that kind of tool. I mention it because it can still be a viable direction.
There are a couple of ways to work with the framework itself.
One way (I'm not familiar with Log4j2 specifically, but it should offer this capability; I did something similar with Log4j 1.x many years ago) is to provide your own logger implementation and associate it with the logger factory at the level of the Log4j2 configuration.
If you do this, the code generated by Lombok will return your logger instance, which can memorize the messages logged at different levels (custom logic you'll have to implement at the level of the logger).
Then the logger will have a method public List<String> getResults() and you'll call the following code during the verification phase:
public void test() {
UnderTest objectUnderTest = ...
//test test test
// verification
MyCustomLogger logger = (MyCustomLogger) LogManager.getLogger(UnderTest.class);
List<String> results = logger.getResults();
assertThat(results, contains("My Log Message with Params I expect or whatever"));
}
Another, somewhat similar way I can think of is to create a custom appender that memorizes all the messages sent during the test. You could then (declaratively or programmatically) bind that appender to the logger obtained by LogManager.getLogger for the class under test (or for other classes, depending on your actual needs).
Then let the test run, and when it comes to verification, get a reference to the appender from the Log4j2 system and ask it for results via some public List<String> getResults() method that must exist on the appender in addition to the methods it must implement to obey the Appender contract.
So the test could look something like this:
public void test () {
MyTestAppender app = createMemorizingAppender();
associateAppenderWithLoggerUnderTest(app, UnderTest.class);
UnderTest underTest = ...
// do your tests that involve logging operations
// now the verification phase:
List<String> results = app.getResults();
assertThat(results, contains("My Log Message with Params I expect or whatever"));
}
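The associateAppenderWithLoggerUnderTest step could be done roughly like this with the Log4j2 core API (a sketch; MyTestAppender is the hypothetical memorizing appender, assumed to extend AbstractAppender):
public static void associateAppenderWithLoggerUnderTest(MyTestAppender app, Class<?> underTest) {
    org.apache.logging.log4j.core.LoggerContext ctx =
            (org.apache.logging.log4j.core.LoggerContext) LogManager.getContext(false);
    org.apache.logging.log4j.core.config.Configuration config = ctx.getConfiguration();
    app.start(); // appenders must be started before they accept events
    // Note: getLoggerConfig returns the closest matching config (possibly the root logger).
    config.getLoggerConfig(underTest.getName())
            .addAppender(app, org.apache.logging.log4j.Level.ALL, null);
    ctx.updateLoggers();
}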
I am migrating my application from Play 1.2 + Java 7 to Play 1.4 + Java 8.
With Play 1.2 + Java 7 my test passes OK.
With Play 1.4 + Java 8 my test fails.
I have reduced the code to the minimum that reproduces the problem. Here is the main part.
The model is
package models;
import play.db.jpa.Model;
import javax.persistence.Entity;
@Entity
public class Token extends Model {
public String name;
public String role;
}
The controller is
package controllers;
import models.Token;
import play.mvc.Controller;
public class Application extends Controller {
public static void index() {
renderJSON(Token.all().fetch());
}
}
The DB test configuration is
%test.application.mode=dev
%test.db.url=jdbc:h2:mem:play;MODE=MYSQL;LOCK_MODE=0
%test.jpa.ddl=create
The test is
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import org.junit.*;
import org.junit.Before;
import play.test.*;
import play.mvc.*;
import play.mvc.Http.*;
import models.*;
public class ApplicationTest extends FunctionalTest {
@Before
public void before() {
Token.deleteAll();
}
@Test
public void testThatIndexPageWorks() {
{
Response response = GET("/");
assertIsOk(response);
String content = getContent(response);
System.out.println(content);
assertFalse(content.contains("le nom"));
assertFalse(content.contains("identifier"));
}
Token t = new Token();
t.name="le nom";
t.role="identifier";
t.save();
{
Response response = GET("/");
assertIsOk(response);
String content = getContent(response);
System.out.println(content);
assertTrue(content.contains("le nom"));
assertTrue(content.contains("identifier"));
}
}
}
The behaviour is not predictable. It seems that entities saved in the test are committed asynchronously, so what the controller sees depends on thread timing, while it did not in release 1.2.
I can provide the whole project if necessary
As I do not want to use fixtures, I have to manually sync the DB: the test's call to model.save() runs inside a local transaction, and that transaction is not yet closed when GET is called, so the data has not been flushed.
I thought that this was covered by JPA's FlushModeType.COMMIT.
That seems to be the case in 1.2.x, but not in 1.4.x.
I modified the test by adding the code snippet below after save() and deleteAll(), and it works fine:
if ( play.db.jpa.JPA.em().getTransaction().isActive()) {
play.db.jpa.JPA.em().getTransaction().commit();
play.db.jpa.JPA.em().getTransaction().begin();
}
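To avoid repeating that block, the test class could wrap it in a small helper (a sketch; commitAndRestartTransaction is a made-up name) and call it after every save() and deleteAll():
// Commit the current transaction so the data becomes visible to the
// controller's request, then open a new one for the rest of the test.
private static void commitAndRestartTransaction() {
    javax.persistence.EntityTransaction tx = play.db.jpa.JPA.em().getTransaction();
    if (tx.isActive()) {
        tx.commit();
        tx.begin();
    }
}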
I have created a web application using Servlets and JSP, and through it I have connected to an Alfresco repository. I am able to upload documents to Alfresco and view them in the external web application.
Now my requirement is to offer check-in and check-out options for those documents.
I found the REST APIs below for this purpose.
But I am not getting how to use these APIs in servlets to fulfill my requirement.
POST /alfresco/service/slingshot/doclib/action/cancel-checkout/site/{site}/{container}/{path}
POST /alfresco/service/slingshot/doclib/action/cancel-checkout/node/{store_type}/{store_id}/{id}
Can anyone please provide the simple steps or some piece of code to do this task?
Thanks in advance.
Please do not use the internal slingshot URLs for this. Instead, use OpenCMIS from Apache Chemistry. It will save you a lot of time and headaches and it is more portable to other repositories besides Alfresco.
The example below grabs an existing document by path, performs a checkout, then checks in a new major version of the plain text document.
package com.someco.cmis.examples;
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.apache.chemistry.opencmis.client.api.Document;
import org.apache.chemistry.opencmis.client.api.ObjectId;
import org.apache.chemistry.opencmis.client.api.Repository;
import org.apache.chemistry.opencmis.client.api.Session;
import org.apache.chemistry.opencmis.client.api.SessionFactory;
import org.apache.chemistry.opencmis.client.runtime.SessionFactoryImpl;
import org.apache.chemistry.opencmis.commons.SessionParameter;
import org.apache.chemistry.opencmis.commons.data.ContentStream;
import org.apache.chemistry.opencmis.commons.enums.BindingType;
public class CheckoutCheckinExample {
private String serviceUrl = "http://localhost:8080/alfresco/api/-default-/public/cmis/versions/1.1/atom"; // CMIS 1.1 AtomPub endpoint
private Session session = null;
public static void main(String[] args) {
CheckoutCheckinExample cce = new CheckoutCheckinExample();
cce.doExample();
}
public void doExample() {
Document doc = (Document) getSession().getObjectByPath("/test/test-plain-1.txt");
String fileName = doc.getName();
ObjectId pwcId = doc.checkOut(); // Checkout the document
Document pwc = (Document) getSession().getObject(pwcId); // Get the working copy
// Set up an updated content stream
String docText = "This is a new major version.";
byte[] content = docText.getBytes();
InputStream stream = new ByteArrayInputStream(content);
ContentStream contentStream = session.getObjectFactory().createContentStream(fileName, Long.valueOf(content.length), "text/plain", stream);
// Check in the working copy as a major version with a comment
ObjectId updatedId = pwc.checkIn(true, null, contentStream, "My new version comment");
doc = (Document) getSession().getObject(updatedId);
System.out.println("Doc is now version: " + doc.getProperty("cmis:versionLabel").getValueAsString());
}
public Session getSession() {
if (session == null) {
// default factory implementation
SessionFactory factory = SessionFactoryImpl.newInstance();
Map<String, String> parameter = new HashMap<String, String>();
// user credentials
parameter.put(SessionParameter.USER, "admin"); // <-- Replace
parameter.put(SessionParameter.PASSWORD, "admin"); // <-- Replace
// connection settings
parameter.put(SessionParameter.ATOMPUB_URL, this.serviceUrl); // AtomPub binding
parameter.put(SessionParameter.BINDING_TYPE, BindingType.ATOMPUB.value()); // AtomPub binding
List<Repository> repositories = factory.getRepositories(parameter);
this.session = repositories.get(0).createSession();
}
return this.session;
}
}
Note that on the version of Alfresco I tested with (5.1.e), the document must already have the versionable aspect applied for the version label to be incremented; otherwise the check-in will simply overwrite the original.
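If your documents might lack that aspect, one option (a sketch, assuming the CMIS 1.1 endpoint shown above, where Alfresco exposes aspects as secondary types) is to add cm:versionable before checking out:
// Add the Alfresco cm:versionable aspect via CMIS 1.1 secondary types
// so subsequent check-ins increment the version label.
Map<String, Object> props = new HashMap<String, Object>();
props.put("cmis:secondaryObjectTypeIds", java.util.Arrays.asList("P:cm:versionable"));
doc = (Document) doc.updateProperties(props);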
I want to write unit tests that test the Elastic query building. I want to test that certain param values produce certain queries.
I started looking into ESTestCase. I see that you can mock a client using ESTestCase. I don't really need to mock the ES node; I just need to reproduce the query-building part, but that requires the client.
Has anybody dealt with such issue?
import java.util.ArrayList;
import java.util.HashMap;
import java.util.Map;
import org.elasticsearch.action.search.SearchRequestBuilder;
import org.elasticsearch.client.Client;
import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.unit.DistanceUnit;
import org.elasticsearch.test.ESIntegTestCase;
import org.elasticsearch.test.ESTestCase;
import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.junit.Ignore;
import org.junit.Test;
import com.google.common.collect.Lists;
public class SearchRequestBuilderTests extends ESTestCase {
private static Client client;
@BeforeClass
public static void initClient() {
//this client will not be hit by any request, but it needs to be a non null proper client
//that is why we create it but we don't add any transport address to it
Settings settings = Settings.builder()
.put("", createTempDir().toString())
.build();
client = TransportClient.builder().settings(settings).build();
}
@AfterClass
public static void closeClient() {
client.close();
client = null;
}
public static Map<String, String> createSampleSearchParams() {
Map<String, String> searchParams = new HashMap<>();
searchParams.put(SenseneConstants.ADC_PARAM, "US");
searchParams.put(SenseneConstants.FETCH_SIZE_QUERY_PARAM, "10");
searchParams.put(SenseneConstants.QUERY_PARAM, "some query");
searchParams.put(SenseneConstants.LOCATION_QUERY_PARAM, "");
searchParams.put(SenseneConstants.RADIUS_QUERY_PARAM, "20");
searchParams.put(SenseneConstants.DISTANCE_UNIT_PARAM, DistanceUnit.MILES.name());
searchParams.put(SenseneConstants.GEO_DISTANCE_PARAM, "true");
return searchParams;
}
@Test
public void test() {
BasicSearcher searcher = new BasicSearcher(client); // this is my application's searcher
Map<String, String> searchParams = createSampleSearchParams();
ArrayList<String> filterQueries = Lists.newArrayList();
SearchRequest searchRequest = SearchRequest.create(searchParams, filterQueries);
MySearchRequestBuilder medleyReqBuilder = new MySearchRequestBuilder.Builder(client, "my_index", searchRequest).build();
SearchRequestBuilder searchRequestBuilder = medleyReqBuilder.constructSearchRequestBuilder();
System.out.print(searchRequestBuilder.toString());
// Here I want to assert that the search request builder output is what it should be for the above client params
}
}
I get this, and nothing in the code runs:
Assertions mismatch: -ea was not specified but -Dtests.asserts=true
REPRODUCE WITH: mvn test -Pdev -Dtests.seed=5F09BEDD71BBD14E -Dtests.class=SearchRequestBuilderTests -Dtests.locale=en_US -Dtests.timezone=America/Los_Angeles
NOTE: test params are: codec=null, sim=null, locale=null, timezone=(null)
NOTE: Mac OS X 10.10.5 x86_64/Oracle Corporation 1.7.0_80 (64-bit)/cpus=4,threads=1,free=122894936,total=128974848
NOTE: All tests run in this JVM: [SearchRequestBuilderTests]
Obviously a bit late but...
So this actually has nothing to do with the ES testing framework but rather with your run settings. Assuming you are running this in Eclipse, this is actually a duplicate of Assertions mismatch: -ea was not specified but -Dtests.asserts=true.
Either: Eclipse Preferences -> JUnit -> enable the Add -ea checkbox.
Or: right-click the Eclipse project -> Run As -> Run Configurations -> Arguments tab -> add the -ea option to the VM arguments.
So I'm making a game with Haxe and Haxepunk. Fine. Except that when I export to C++, nothing renders! I posted this previously on the Haxepunk boards, so more info can be found here. Here's an excerpt from the Haxepunk thread:
I can still compile it just fine, but nothing in the game is actually rendering except for the background color I defined. The console still works and renders fine, though. The HaxePunk console tells me Atlases using BitmapData will not be managed.
I'm using Ash's component-entity system rather than HaxePunk's entities. The relevant objects have a Visible component attached to them, which looks like this:
package game.component;
import com.haxepunk.Graphic;
import com.haxepunk.graphics.Image;
class Visible {
public var image(default, default) : Graphic;
public function new() {
this.image = Image.createRect(16, 16, 0xFF0000);
}
}
And this is the associated rendering system:
package game.system;
import ash.core.Engine;
import ash.core.Entity;
import ash.core.System;
import ash.tools.ListIteratingSystem;
import com.haxepunk.HXP;
import Constants;
import game.component.Positionable;
import game.component.Visible;
import game.node.RenderNode;
class RenderingSystem extends ListIteratingSystem<RenderNode> {
public function new() {
super(RenderNode, this.updateNode);
}
private function updateNode(node:RenderNode, time:Float) : Void {
node.renderable.image.render(HXP.buffer, node.position.position, Constants.ORIGIN);
}
}
Any tips?
If you are using buffer rendering in C++, you'll need to set the render mode inside the Engine constructor, because the Engine constructor is the only place the screen buffer is created. Unfortunately, the API docs don't clearly explain this.
class Main extends Engine
{
    public function new()
    {
        // Engine(width, height, frameRate, fixed, renderMode): passing
        // RenderMode.BUFFER here is what creates the screen buffer (HXP.buffer).
        super(0, 0, 60, false, RenderMode.BUFFER);
    }
}
}