I want to plot a date (x axis) against a time of day (y axis), both extracted from the same Date, item.sunrise (parsed from ISO 8601). The compiler rejects the code below, stating: Initializer 'init(x:y:)' requires that 'DateComponents' conform to 'Plottable'.
Chart {
    ForEach(sunriseSunsetDates) { item in
        LineMark(
            x: .value("Date", calendar.dateComponents([.year, .month, .day], from: item.sunrise)),
            y: .value("Time", calendar.dateComponents([.hour, .minute, .second], from: item.sunrise))
        )
        .foregroundStyle(.blue.gradient)
        .interpolationMethod(.catmullRom)
    }
}
Can this be done or do I need to rewrite this in a different format?
To plot Dates, you can use the value overload that takes a Date and a Calendar.Component unit as parameters:
Chart {
    ForEach(sunriseSunsetDates) { item in
        LineMark(
            x: .value("Date", item.sunrise, unit: .day),
            y: .value("Time", item.sunrise, unit: .minute)
        )
    }
}
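Note that item.sunrise on the y axis still carries its full date, so the y axis will span the whole date range rather than a single day of times. A minimal sketch of one workaround (my own addition, not from the original answer): rebuild each sunrise as a Date on a fixed reference day, so the y values differ only in their time of day:

// Hypothetical helper: anchors the time-of-day of `date` to a fixed
// reference day so Swift Charts plots only the time on the y axis.
func timeOfDay(_ date: Date, calendar: Calendar = .current) -> Date {
    let time = calendar.dateComponents([.hour, .minute, .second], from: date)
    var anchored = DateComponents(year: 2000, month: 1, day: 1)
    anchored.hour = time.hour
    anchored.minute = time.minute
    anchored.second = time.second
    return calendar.date(from: anchored)!
}

Using y: .value("Time", timeOfDay(item.sunrise), unit: .minute) then keeps the x axis on calendar dates and the y axis on times of day.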
I have a chart which displays data over time (months, in my case). The data for this chart is randomly generated: each row gets a date from the past 500 days (chartDate) and a random number (sales), and 500 such rows are produced.
func dailySales2() -> [(chartDate: Date, year: String, sales: Int)] {
    var temp: [(chartDate: Date, year: String, sales: Int)] = []
    for _ in 1...500 {
        let dateT = Calendar.current.date(byAdding: .day, value: -Int.random(in: 1...500), to: Date())!
        temp.append((chartDate: dateT, year: dateT.formatted(.dateTime.year()), sales: Int.random(in: 100...500)))
    }
    return temp.sorted(by: { $0.chartDate < $1.chartDate })
}
Chart {
    ForEach(dailySales2(), id: \.chartDate) { chartD in
        BarMark(
            x: .value("Day", chartD.chartDate, unit: .month),
            y: .value("Sales", chartD.sales)
        )
        .foregroundStyle(by: .value("Year:", chartD.year))
    }
}
The above takes all the rows for each month and totals them, so each bar is the sum of all the sales for that month. If I change this to a LineMark, it displays each row individually. How can I stop this and instead sum all the rows into a single point for each month? Thanks.
I'm expecting each month to display a single point that sums all the rows containing data for that month.
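A minimal sketch of one common approach (my own addition, not from the thread): aggregate the rows into one value per month before charting, for example with Dictionary(grouping:by:) keyed on the start of each month, and plot the aggregated array instead:

// Hypothetical helper: collapses the raw rows into one (month, total) pair
// per calendar month, keyed on the first instant of that month.
func monthlySales(_ rows: [(chartDate: Date, year: String, sales: Int)]) -> [(month: Date, sales: Int)] {
    let calendar = Calendar.current
    let grouped = Dictionary(grouping: rows) { row in
        calendar.dateInterval(of: .month, for: row.chartDate)!.start
    }
    return grouped
        .map { month, rows in (month: month, sales: rows.reduce(0) { $0 + $1.sales }) }
        .sorted { $0.month < $1.month }
}

Iterating over monthlySales(dailySales2()) with id: \.month then gives the LineMark exactly one point per month.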
I have a chart which displays data over time (months, in my case). The data for this chart is randomly generated: each row gets a date from the past 500 days (chartDate) and a random number (sales), and 500 such rows are produced.
func dailySales2() -> [(chartDate: Date, year: String, sales: Int)] {
    var temp: [(chartDate: Date, year: String, sales: Int)] = []
    for _ in 1...500 {
        let dateT = Calendar.current.date(byAdding: .day, value: -Int.random(in: 1...500), to: Date())!
        temp.append((chartDate: dateT, year: dateT.formatted(.dateTime.year()), sales: Int.random(in: 100...500)))
    }
    return temp.sorted(by: { $0.chartDate < $1.chartDate })
}
Chart {
    ForEach(dailySales2(), id: \.chartDate) { chartD in
        BarMark(
            x: .value("Day", chartD.chartDate, unit: .month),
            y: .value("Sales", chartD.sales)
        )
        .foregroundStyle(by: .value("Year:", chartD.year))
    }
}
.chartXAxis {
    AxisMarks(values: .stride(by: .month)) {
        AxisValueLabel(format: .dateTime.month(.narrow), centered: true)
    }
}
This generates A S O N D J F M A M J J A S O N D columns (August 2021 through December 2021, then January 2022 through December 2022). How do I go about having only one set of January through December columns and putting both years into those columns (separated by series)? Thanks.
The goal is 12 columns, one for each month, which any year uses.
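A minimal sketch of one approach (my own addition, not from the thread): normalize every date into the same arbitrary year, so rows from different years land on the same month column, and keep the year as the series:

// Hypothetical helper: re-dates `date` into a fixed reference year (2000,
// a leap year, so Feb 29 survives) while keeping its month and day.
func inCommonYear(_ date: Date, calendar: Calendar = .current) -> Date {
    var comps = calendar.dateComponents([.month, .day], from: date)
    comps.year = 2000
    return calendar.date(from: comps)!
}

Chart {
    ForEach(dailySales2(), id: \.chartDate) { chartD in
        BarMark(
            x: .value("Month", inCommonYear(chartD.chartDate), unit: .month),
            y: .value("Sales", chartD.sales)
        )
        .foregroundStyle(by: .value("Year", chartD.year))
        // position(by:) draws the years side by side instead of stacking them
        .position(by: .value("Year", chartD.year))
    }
}

The existing .chartXAxis stride then labels the twelve shared month columns as before.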
I've created a horizontal floating bar chart using ChartJS. The data I am passing in is formatted as:
[
    {
        rowName: 'Project 1',
        startDate: '2021-03-15',
        endDate: '2021-04-20',
    }
]
Where my x axis shows a month/year and my y axis shows the rowName. I've added chartjs-adapter-date-fns, but to get the floating bars to work I've had to convert startDate and endDate into Date objects and call .getTime() to produce the numbers the chart expects, e.g. [new Date(startDate).getTime(), new Date(endDate).getTime()].
On my tooltip, it shows the label as rowName, which is what I want; however, the data value shows as the two raw numbers being passed in.
I'm wanting to show the tooltip in the following format:
Project 1
Start Date: 05/03/2021
End Date: 20/04/2021
What is the best way of doing this?
Note: I have logged the context to the console and found that data.raw provides 2021-05-03,2021-04-20, if that is of any use.
Format the dates in the tooltip's label callback instead of passing the raw millisecond values through. Assuming Chart.js v3 (the context object you logged matches its tooltip API), something like:
var barOption = {
    plugins: {
        tooltip: {
            callbacks: {
                label: function (context) {
                    // context.raw holds the [startDate, endDate] pair for this bar
                    const [start, end] = context.raw;
                    // the en-GB locale renders dates as dd/mm/yyyy
                    const format = (d) => new Date(d).toLocaleDateString('en-GB');
                    // returning an array renders one tooltip line per element
                    return ['Start Date: ' + format(start), 'End Date: ' + format(end)];
                },
            },
        },
    },
};
(In Chart.js v2 the callback lives under tooltips.callbacks and receives (tooltipItem, data) instead of a single context.)
I'm trying to update code to Swift 3, but I can't find anything about CGPathAddCurveToPoint. How can I fix the error?
path is a CGMutablePath:
CGPathAddCurveToPoint(path, nil, 10, 10, 20, 20, 30, 30)
error: nil is not compatible with expected argument type 'UnsafePointer'
I found that CGPathAddLineToPoint(path, nil, 10, 10) became path.addLine(to: p0).
Please read the CGMutablePath docs:
https://developer.apple.com/reference/coregraphics/cgmutablepath
You will find:
func addCurve(to end: CGPoint, control1: CGPoint, control2: CGPoint,
              transform: CGAffineTransform = default)
Note that this is now a method, not a loose global function.
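Applied to the call in the question (the old C function took the two control points first and the end point last), that becomes:
path.addCurve(to: CGPoint(x: 30, y: 30),
              control1: CGPoint(x: 10, y: 10),
              control2: CGPoint(x: 20, y: 20))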
I'm using this code piece in my Swift project.
The following code worked fine in Swift 2 but doesn't seem to work in Swift 3 anymore; it gives the following error:
Cannot assign value of type '(CGFloat, CGFloat, Int, Int)' to type 'CGRect'
if plusLayer == nil {
    plusLayer = CALayer()
    let halfWidth: CGFloat = self.bounds.size.width / 2
    let halfHeight: CGFloat = self.bounds.size.height / 2
    let ds: CGFloat = sqrt(halfWidth * halfWidth / 2)
    let x: CGFloat = halfWidth + ds - 27 / 2
    let y: CGFloat = halfHeight - ds - 27 / 2
    plusLayer.frame = (x, y, 27, 27)
    plusLayer.contentsGravity = kCAGravityResizeAspectFill
    if let i = plusSignImage {
        plusLayer.contents = i.cgImage
    } else {
        plusSignImage = UIImage(named: "PlusSign", in: Bundle(for: self.dynamicType), compatibleWith: UITraitCollection(displayScale: UIScreen.main().scale))
        plusLayer.contents = plusSignImage!.cgImage
    }
    layer.addSublayer(plusLayer)
}
Does anyone know how I can solve this issue? Help would be really appreciated!
Thanks! :D
This line
plusLayer.frame = (x, y, 27, 27)
fails because the tuple mixes CGFloat and Int values, and in any case Swift will not convert a tuple into a CGRect. Construct an actual CGRect and let the initializer type the integer literals as CGFloats:
plusLayer.frame = CGRect(x: x, y: y, width: 27, height: 27)
As an old developer, let me add some clarity:
a) casting is a powerful feature of some languages, but it must be understood in detail
b) we can cast using "as", but that is NOT the same as converting with CGFloat(xx)
c) "as" must be used for type casting of objects
(see:
https://developer.apple.com/library/ios/documentation/Swift/Conceptual/Swift_Programming_Language/TypeCasting.html
)
d) CGRectMake does still work in Swift 2.x / Xcode 7.3
e) don't cast to Double; simply use:
let r = CGRect(x: 10, y: 20, width: 30, height: 40)
The compiler will understand which of these initializers to call:
public init(x: CGFloat, y: CGFloat, width: CGFloat, height: CGFloat)
public init(x: Double, y: Double, width: Double, height: Double)
public init(x: Int, y: Int, width: Int, height: Int)
and will call the 3rd.
If you write:
let d = 10.0
let r2 = CGRect(x: d, y: 20, width: 30, height: 40)
it will call the 2nd, since d is a Double.
My two cents.
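To illustrate point b with a small sketch of my own: 10 as CGFloat merely tells the compiler how to type the literal, while CGFloat(n) converts an existing value at run time:
let literal = 10 as CGFloat   // the literal is simply inferred as CGFloat
let n: Int = 10
// let bad = n as CGFloat     // error: 'as' cannot convert an Int to CGFloat
let converted = CGFloat(n)    // explicit conversion via an initializer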