Is there a way to have chart.js display the datapoint as 0 if a data point is missing from a date-based dataset? For example, if I have a dataset with a null data value for a month, chart.js won't display a data point. Google charts seems to do this, but I would really prefer to stick with Chart.js if this functionality is there.
Thanks in advance.
No, there is no such function. Chart.js only renders the input data it is given; it does not know the meaning behind the data, so it cannot fill in missing points for you. It takes the data as it is, and that is good.
You have to write a function which checks whether all the data points you want are in your array and, if not, creates them and initializes them with 0. Then hand the array over to Chart.js.
I think you need something like this:
// Assuming `data` is an array of {x: Date, y: number} points (an assumption about the data shape):
let needed_days_in_data = (end_time.getTime() - start_time.getTime()) / (1000 * 3600 * 24);
for (let i = 0; i < needed_days_in_data; i++) {
  let date_check = new Date(start_time.getTime());
  date_check.setDate(start_time.getDate() + i);
  // If the date is not already in the array, push it with a value of 0.
  if (!data.some(point => point.x.getTime() === date_check.getTime())) {
    data.push({ x: date_check, y: 0 });
  }
}
Yes, in Chart.js you can have values of zero, like this:
var myChart = new Chart(ctx, {
  type: 'bar',
  data: {
    labels: ['Red', 'Blue', 'Yellow', 'Green', 'Purple', 'Orange'],
    datasets: [{
      data: [20, 0, 0, 0, 3, 9]
    }]
  }
});
I'm developing a scatter graph to visualize the trend of product volume for every product count.
x: product count
y: product volume
From the Django view, two array lists were passed to the chart JavaScript in the Django template. Here is the code:
<script>
  const volume = {{product_volume}} // <--- array list of product_volume
  const count = {{product_count}}   // <--- array list of product_count
  const p1ctx = document.getElementById('VolumeChart');
  const P1Chart = new Chart(p1ctx, {
    data: {
      datasets: [{
        type: 'scatter',
        label: 'Volume',
        data: volume, // <-------------------- Target to assign 'x' and 'y' data
        backgroundColor: 'rgb(255, 99, 132)'
      }],
    },
    .
    .
    .
</script>
However, with huge data (approx. 15,000 points), how can I simplify the process of building the data in the JavaScript from the array lists of product count and product volume?
According to the documentation for Chart.js the scatter chart accepts data as an array with {x: ..., y: ...} pairs.
To reshape the two arrays volume and count into that shape, you'd do
const data = count.map((x, index) => ({x, y: volume[index]}));
assuming they're the same length.
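For instance, the reshaped array can then be assigned directly to the scatter dataset. A minimal sketch, reusing the volume, count, and VolumeChart names from the question:
const data = count.map((x, index) => ({ x, y: volume[index] }));

const P1Chart = new Chart(document.getElementById('VolumeChart'), {
  data: {
    datasets: [{
      type: 'scatter',
      label: 'Volume',
      data: data, // each entry is now an {x: count, y: volume} pair
      backgroundColor: 'rgb(255, 99, 132)'
    }]
  }
});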
I'm trying to position a custom Chart.js tooltip in the middle of two bars. I did some research in the Chart.js docs and various posts but didn't find any approach other than positioning it with the following snippet (I'm using jQuery and toggling a class to hide/show the tooltip; please focus on the values I'm using to position the tooltip):
function setupTooltipPosition(config) {
  const { tooltipElement, tooltipModel } = config;
  const leftOffset = this._chart.canvas.offsetLeft;
  const topOffset = this._chart.canvas.offsetTop;
  tooltipElement.removeClass(tooltipSelectors.hide);
  tooltipElement.css('left', `${tooltipModel.caretX - leftOffset}px`);
  tooltipElement.css('top', `${tooltipModel.caretY - topOffset}px`);
}
Using this approach in this particular case I got this result:
[screenshot of the resulting tooltip position]
I also tried using the x and y values of the tooltipModel; however, that doesn't make it any clearer how to put the tooltip in the middle. If you add a custom offset to put it in the middle, it works for just this one case, but as soon as more data is added it no longer works. I can have at most 2 datasets; however, the user can add up to 10 data values, which means 10 groups of bars.
The question is: is there a way to put the tooltip in the middle of the group of bars (or of a single bar, in case the user has just 1 dataset) using the values that Chart.js provides for tooltips? Or otherwise, do you know any custom approach to center this tooltip?
Thanks!!
You don't specify if 'middle' means on the x- and/or y-axis. But since you wrote:
...on the middle of two bars...
I assume you mean on the x-axis. In this case you need to set the options.tooltips.mode property to index. Below is a working example.
let myChart = new Chart(document.getElementById('myChart'), {
  type: 'bar',
  data: {
    labels: ['x', 'y', 'z'],
    datasets: [{
      label: '1993',
      data: [50, 30, 65],
      backgroundColor: '#503291'
    }, {
      label: '1994',
      data: [60, 15, 65],
      backgroundColor: '#1bbecd'
    }]
  },
  options: {
    tooltips: {
      mode: 'index'
    },
    scales: {
      yAxes: [{
        ticks: {
          beginAtZero: true
        }
      }],
      xAxes: [{
        barPercentage: 1
      }]
    }
  }
});
<script src="https://cdnjs.cloudflare.com/ajax/libs/Chart.js/2.7.2/Chart.min.js"></script>
<canvas id="myChart"></canvas>
For future questions, please provide an MCVE, as that makes it much quicker and easier to help you.
I've found numerous posts on how to gradient fill the area beneath the chart, but I'd like to do this:
Is that doable with ChartJS?
It is doable, up to a point. A simple approach, presented below, assumes one dataset only (it should be easy to extend the approach to handle more datasets, though). The idea is as follows. We create a plugin that overrides the beforeUpdate method (which is called at the start of every update). At the start of every update, the exact Y pixels of the min and max values of the dataset are calculated. A vertical linear gradient is then created from the context of the canvas using createLinearGradient, with a kind of red at the Y pixel that corresponds to the min value of the dataset and a jazzy kind of blue at the Y pixel that corresponds to the max value of the dataset. Look at the commented code for more information. There may be some glitches regarding hovering over points and legend coloring, which I am not very keen on looking into. A working fiddle is here and the code is also available below.
var gradientLinePlugin = {
  // Called at start of update.
  beforeUpdate: function(chartInstance) {
    if (chartInstance.options.linearGradientLine) {
      // The context, needed for the creation of the linear gradient.
      var ctx = chartInstance.chart.ctx;
      // The first (and, assuming, only) dataset.
      var dataset = chartInstance.data.datasets[0];
      // Calculate min and max values of the dataset.
      var minValue = Number.MAX_VALUE;
      // Start max at -Infinity (Number.MIN_VALUE is the smallest positive number, not the most negative).
      var maxValue = Number.NEGATIVE_INFINITY;
      for (var i = 0; i < dataset.data.length; ++i) {
        if (minValue > dataset.data[i])
          minValue = dataset.data[i];
        if (maxValue < dataset.data[i])
          maxValue = dataset.data[i];
      }
      // Calculate Y pixels for min and max values.
      var yAxis = chartInstance.scales['y-axis-0'];
      var minValueYPixel = yAxis.getPixelForValue(minValue);
      var maxValueYPixel = yAxis.getPixelForValue(maxValue);
      // Create the gradient.
      var gradient = ctx.createLinearGradient(0, minValueYPixel, 0, maxValueYPixel);
      // A kind of red for min.
      gradient.addColorStop(0, 'rgba(231, 18, 143, 1.0)');
      // A kind of blue for max.
      gradient.addColorStop(1, 'rgba(0, 173, 238, 1.0)');
      // Assign the gradient to the dataset's border color.
      dataset.borderColor = gradient;
      // Uncomment this for some effects, especially together with commenting the `fill: false` option below.
      // dataset.backgroundColor = gradient;
    }
  }
};
Chart.pluginService.register(gradientLinePlugin);
var ctx = document.getElementById("myChart");
var myChart = new Chart(ctx, {
  type: 'line',
  data: {
    labels: ["First", "Second", "Third", "Fourth", "Fifth"],
    datasets: [{
      label: 'My Sample Dataset',
      data: [20, 30, 50, 10, 40],
      // No curves.
      tension: 0,
      // No fill under the line.
      fill: false
    }],
  },
  options: {
    // Option for coloring the line with a gradient.
    linearGradientLine: true,
    scales: {
      yAxes: [{
        ticks: {
          min: 0,
          max: 100,
          stepSize: 20
        }
      }]
    }
  }
});
<script src="https://cdnjs.cloudflare.com/ajax/libs/Chart.js/2.4.0/Chart.min.js"></script>
<canvas id="myChart" width="400" height="200"></canvas>
There is also a pluginless method, mentioned here, but that method is lacking. According to that method, one would have to set the borderColor to a gradient that should have been created before the creation of the chart. The gradient is calculated statically and will never fit an arbitrary range or respond to resizing as is.
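For reference, a minimal sketch of that pluginless approach (the pixel coordinates for the gradient are assumptions here): the gradient is created once, in fixed canvas pixels, before the chart exists, which is exactly why it cannot adapt to the data range or to resizing.
var staticCtx = document.getElementById("myChart").getContext('2d');
// Fixed pixel span for the gradient - an assumption that happens to fit a 200px-high canvas.
var staticGradient = staticCtx.createLinearGradient(0, 200, 0, 0);
staticGradient.addColorStop(0, 'rgba(231, 18, 143, 1.0)');
staticGradient.addColorStop(1, 'rgba(0, 173, 238, 1.0)');
var staticChart = new Chart(staticCtx, {
  type: 'line',
  data: {
    labels: ["First", "Second", "Third", "Fourth", "Fifth"],
    datasets: [{
      label: 'My Sample Dataset',
      data: [20, 30, 50, 10, 40],
      fill: false,
      borderColor: staticGradient // set once, never recalculated
    }]
  }
});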
I have a Chart.js line chart showing the sales of different products over a range of dates. The user can select a date range (for example from 2015-12-01 to 2015-12-10) to view the sales per day, and that's fine and working.
But if the user selects only one day (a range from, for example, 2015-12-01 to 2015-12-01), he gets the correct diagram, but it doesn't look good:
As you can see, the points are stuck to the y-axis. Is there a possibility to center the points on the diagram?
That's how it should look:
Instead of hardcoding the labels and values with blank parameters, use the offset property.
const options = {
  scales: {
    x: {
      offset: true
    }
  }
}
Documentation: https://www.chartjs.org/docs/latest/axes/cartesian/linear.html#common-options-to-all-cartesian-axes
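As a rough illustration of where this fits (Chart.js v3+ syntax; the element id, label, and value are made up for this sketch), a one-point line chart with the x scale offset enabled so the single point sits in the middle of the chart area:
const singleDayChart = new Chart(document.getElementById('salesChart'), {
  type: 'line',
  data: {
    labels: ['2015-12-01'],
    datasets: [{ label: 'Sales', data: [42] }]
  },
  options: {
    scales: {
      x: { offset: true } // pads the scale so the point is not glued to the y-axis
    }
  }
});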
You can check the length of your labels (or data) arrays and add dummy, non-renderable points to the left and right by using empty string labels and null values, like so:
var chartData = {
  labels: ['', "A", ''],
  datasets: [
    {
      fillColor: "rgba(255, 52, 21, 0.2)",
      pointColor: "#da3e2f",
      strokeColor: "#da3e2f",
      data: [null, 20, null]
    },
    {
      fillColor: "rgba(52, 21, 255, 0.2)",
      strokeColor: "#1C57A8",
      pointColor: "#1C57A8",
      data: [null, 30, null]
    },
  ]
}
Fiddle - https://jsfiddle.net/pf24vg16/
Wanted to add to the above answer and say that I got a similar effect on a time series scatter plot using this:
if (values.length === 1) {
  const arrCopy = Object.assign({}, values);
  values.unshift({x: arrCopy[0].x - 86400000, y: null});
  values.push({x: arrCopy[0].x + 2 * 86400000, y: null});
}
That only handles for a single point, however. To add in functionality for multiple points, I did the following:
const whether = (array) => {
  const len = array.length;
  let isSame = false;
  for (let i = 1; i < len; i++) {
    // Use the absolute difference so the order of the points does not matter.
    if (Math.abs(array[0].x - array[i].x) >= 43200000) {
      isSame = false;
      break;
    } else {
      isSame = true;
    }
  }
  return isSame;
}

if (values.length === 1 || whether(values)) {
  const arrCopy = Object.assign({}, values);
  values.unshift({x: arrCopy[0].x - 86400000, y: null});
  values.push({x: arrCopy[0].x + 2 * 86400000, y: null});
}
You might notice I'm just subtracting/adding a day in milliseconds to the x values. To be honest, I was just having the worst of times with moment.js and gave up, haha. Hope this helps someone else!
Note: my code has a tolerance of 43200000, or 12 hours, on the time. You could use moment.js to compare days if you have better luck with it than I did tonight :)
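If moment.js does cooperate, a sketch of that day-based comparison might look like this (assuming the x values are millisecond timestamps and moment is available):
// Returns true when every point falls on the same calendar day as the first one.
const allOnSameDay = (points) =>
  points.every(point => moment(point.x).isSame(moment(points[0].x), 'day'));

if (values.length === 1 || allOnSameDay(values)) {
  // ...pad the dataset as shown above.
}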
For your specific problem, try to modify the options->scales->xAxes option like so:
options: {
  title: {
    display: true,
    text: 'mytitle1'
  },
  scales: {
    xAxes: [{
      type: 'linear',
      ticks: {
        suggestedMin: 0,
        suggestedMax: (11.12 * 2),
        stepSize: 1 // interval between ticks
      }
    }]
  }
}
More info at: Chart JS: Ignoring x values and putting point data on first available labels
Consider drawing a column chart when I don't get any data from the data source. How do we draw an empty chart instead of showing a red-colored default message saying "Table has no columns"?
What I do is initialize my chart with 1 column and 1 data point (set to 0). Then whenever data gets added I check if there is only 1 column and that it is the dummy column; if so, I remove it. I also hide the legend to begin with so that it doesn't appear with the dummy column, then I add it back when the new column gets added.
Here is some sample code you can plug in to the Google Visualization Playground that does what I am talking about. You should see the empty chart for 2 seconds, then data will get added and the columns will appear.
var data, options, chart;

function drawVisualization() {
  data = google.visualization.arrayToDataTable([
    ['Time', 'dummy'],
    ['', 0],
  ]);

  options = {
    title: "My Chart",
    width: 600, height: 400,
    hAxis: {title: "Time"},
    legend: {position: 'none'}
  };

  // Create and draw the visualization.
  chart = new google.visualization.ColumnChart(document.getElementById('visualization'));
  chart.draw(data, options);

  setTimeout('addData("12:00",10)', 2000);
  setTimeout('addData("12:10",20)', 3000);
}

function addData(x, y) {
  if (data.getColumnLabel(1) == 'dummy') {
    data.addColumn('number', 'Your Values', 'col_id');
    data.removeColumn(1);
    options.legend = {position: 'right'};
  }
  data.addRow([x, y]);
  chart.draw(data, options);
}
An even better solution for this problem might be to use an annotation column instead of a data column, as shown below. With this solution you do not need any setTimeout or custom function to remove or hide your column. Give it a try by pasting the code below into the Google Code Playground.
function drawVisualization() {
  var data = google.visualization.arrayToDataTable([
    ['', { role: 'annotation' }],
    ['', '']
  ]);

  var ac = new google.visualization.ColumnChart(document.getElementById('visualization'));
  ac.draw(data, {
    title: 'Just a title...',
    width: 600,
    height: 400
  });
}
The way I did this was by hiding the pie slice text, turning off tooltips, stuffing in a pretend value, and making it gray. I'm sure there are more clever ways to do this, but this worked for me where the other methods didn't.
The only drawback is that it sets both items in the legend to gray as well. I think you could perhaps just add a third item, and make it invisible on the legend only. I liked this way though.
function drawChart() {
  // Define the chart to be drawn.
  data = new google.visualization.DataTable();
  data.addColumn({type: 'string', label: 'Result'});
  data.addColumn({type: 'number', label: 'Count'});
  data.addRows([
    ['Value A', 0],
    ['Value B', 0]
  ]);

  var opt_pieslicetext = null;
  var opt_tooltip_trigger = null;
  var opt_color = null;
  // If every value is zero, hide the slice text and tooltips,
  // stuff in a tiny pretend value and color the chart gray.
  if (data.getValue(1, 1) == 0 && data.getValue(0, 1) == 0) {
    opt_pieslicetext = 'none';
    opt_tooltip_trigger = 'none';
    data.setCell(1, 1, .1);
    opt_color = ['#D3D3D3'];
  }

  chart = new google.visualization.PieChart(document.getElementById('mydiv'));
  chart.draw(data, {
    sliceVisibilityThreshold: 0,
    pieSliceText: opt_pieslicetext,
    tooltip: { trigger: opt_tooltip_trigger },
    colors: opt_color
  });
}