
promql: Separate `Point` into `FPoint` and `HPoint`

In other words: Instead of having a “polymorphous” `Point` that can
either contain a float value or a histogram value, use an `FPoint` for
floats and an `HPoint` for histograms.

This seemingly small change has a _lot_ of repercussions throughout
the codebase.

The idea here is to avoid the increase in size of `Point` arrays that
happened after native histograms had been added.
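
Concretely, the split replaces one three-field struct with two
two-field structs. The definitions below mirror the types in
`promql/value.go` before and after this commit; the size arithmetic
assumes a 64-bit platform:

```
// Before: a single "polymorphous" point, 24 bytes per element
// (8-byte timestamp + 8-byte float + 8-byte histogram pointer),
// even for series that never contain a histogram.
type Point struct {
	T int64
	V float64
	H *histogram.FloatHistogram
}

// After: two specialized points. A []FPoint is back to 16 bytes per
// element, as before native histograms; histogram points live in a
// separate []HPoint slice.
type FPoint struct {
	T int64
	F float64
}

type HPoint struct {
	T int64
	H *histogram.FloatHistogram
}
```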

The higher-level data structures (`Sample`, `Series`, etc.) are still
“polymorphous”. The same idea could be applied to them, but at each
step the trade-offs would need to be evaluated.
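
For example, `Sample` keeps both value fields, with `H == nil`
signalling a float sample, while `Series` now holds the two point
slices side by side (again mirroring `promql/value.go` after this
commit):

```
// Sample remains "polymorphous": if H is nil, the sample is a float
// with value F; otherwise it is a histogram sample and F is ignored.
type Sample struct {
	T int64
	F float64
	H *histogram.FloatHistogram

	Metric labels.Labels
}

// Series carries floats and histograms in separate, specialized slices.
type Series struct {
	Metric     labels.Labels
	Floats     []FPoint
	Histograms []HPoint
}
```

The `H == nil` test is the branching pattern that recurs throughout
the `promql/engine.go` diff below.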

The idea with this change is to do the minimum necessary to get back
to pre-histogram performance for functions that do not touch
histograms. Here are comparisons for the `changes` function. The test
data doesn't include histograms yet. Ideally, there would be no change
in the benchmark result at all.

First, the runtime of v2.39 compared to directly prior to this commit:

```
name                                                  old time/op    new time/op    delta
RangeQuery/expr=changes(a_one[1d]),steps=1-16            391µs ± 2%     542µs ± 1%  +38.58%  (p=0.000 n=9+8)
RangeQuery/expr=changes(a_one[1d]),steps=10-16           452µs ± 2%     617µs ± 2%  +36.48%  (p=0.000 n=10+10)
RangeQuery/expr=changes(a_one[1d]),steps=100-16         1.12ms ± 1%    1.36ms ± 2%  +21.58%  (p=0.000 n=8+10)
RangeQuery/expr=changes(a_one[1d]),steps=1000-16        7.83ms ± 1%    8.94ms ± 1%  +14.21%  (p=0.000 n=10+10)
RangeQuery/expr=changes(a_ten[1d]),steps=1-16           2.98ms ± 0%    3.30ms ± 1%  +10.67%  (p=0.000 n=9+10)
RangeQuery/expr=changes(a_ten[1d]),steps=10-16          3.66ms ± 1%    4.10ms ± 1%  +11.82%  (p=0.000 n=10+10)
RangeQuery/expr=changes(a_ten[1d]),steps=100-16         10.5ms ± 0%    11.8ms ± 1%  +12.50%  (p=0.000 n=8+10)
RangeQuery/expr=changes(a_ten[1d]),steps=1000-16        77.6ms ± 1%    87.4ms ± 1%  +12.63%  (p=0.000 n=9+9)
RangeQuery/expr=changes(a_hundred[1d]),steps=1-16       30.4ms ± 2%    32.8ms ± 1%   +8.01%  (p=0.000 n=10+10)
RangeQuery/expr=changes(a_hundred[1d]),steps=10-16      37.1ms ± 2%    40.6ms ± 2%   +9.64%  (p=0.000 n=10+10)
RangeQuery/expr=changes(a_hundred[1d]),steps=100-16      105ms ± 1%     117ms ± 1%  +11.69%  (p=0.000 n=10+10)
RangeQuery/expr=changes(a_hundred[1d]),steps=1000-16     783ms ± 3%     876ms ± 1%  +11.83%  (p=0.000 n=9+10)
```

And then, the runtime of v2.39 compared to after this commit:

```
name                                                  old time/op    new time/op    delta
RangeQuery/expr=changes(a_one[1d]),steps=1-16            391µs ± 2%     547µs ± 1%  +39.84%  (p=0.000 n=9+8)
RangeQuery/expr=changes(a_one[1d]),steps=10-16           452µs ± 2%     616µs ± 2%  +36.15%  (p=0.000 n=10+10)
RangeQuery/expr=changes(a_one[1d]),steps=100-16         1.12ms ± 1%    1.26ms ± 1%  +12.20%  (p=0.000 n=8+10)
RangeQuery/expr=changes(a_one[1d]),steps=1000-16        7.83ms ± 1%    7.95ms ± 1%   +1.59%  (p=0.000 n=10+8)
RangeQuery/expr=changes(a_ten[1d]),steps=1-16           2.98ms ± 0%    3.38ms ± 2%  +13.49%  (p=0.000 n=9+10)
RangeQuery/expr=changes(a_ten[1d]),steps=10-16          3.66ms ± 1%    4.02ms ± 1%   +9.80%  (p=0.000 n=10+9)
RangeQuery/expr=changes(a_ten[1d]),steps=100-16         10.5ms ± 0%    10.8ms ± 1%   +3.08%  (p=0.000 n=8+10)
RangeQuery/expr=changes(a_ten[1d]),steps=1000-16        77.6ms ± 1%    78.1ms ± 1%   +0.58%  (p=0.035 n=9+10)
RangeQuery/expr=changes(a_hundred[1d]),steps=1-16       30.4ms ± 2%    33.5ms ± 4%  +10.18%  (p=0.000 n=10+10)
RangeQuery/expr=changes(a_hundred[1d]),steps=10-16      37.1ms ± 2%    40.0ms ± 1%   +7.98%  (p=0.000 n=10+10)
RangeQuery/expr=changes(a_hundred[1d]),steps=100-16      105ms ± 1%     107ms ± 1%   +1.92%  (p=0.000 n=10+10)
RangeQuery/expr=changes(a_hundred[1d]),steps=1000-16     783ms ± 3%     775ms ± 1%   -1.02%  (p=0.019 n=9+9)
```

In summary, the runtime doesn't really improve with this change for
queries with just a few steps. For queries with many steps, this
commit essentially reinstates the old performance. This is good
because the many-step queries are the ones that matter most (longest
absolute runtime).

In terms of allocations, though, this commit doesn't make a dent at
all (numbers not shown). The reason is that most of the allocations
happen in the sampleRingIterator (in the storage package), which has
to be addressed in a separate commit.

Signed-off-by: beorn7 <beorn@grafana.com>

22 changed files, with the number of changed lines:

1. cmd/promtool/unittest.go (5)
2. docs/querying/functions.md (37)
3. promql/engine.go (390)
4. promql/engine_test.go (141)
5. promql/functions.go (473)
6. promql/functions_test.go (2)
7. promql/quantile.go (2)
8. promql/test.go (26)
9. promql/test_test.go (22)
10. promql/value.go (239)
11. rules/alerting.go (10)
12. rules/alerting_test.go (46)
13. rules/manager.go (5)
14. rules/manager_test.go (40)
15. rules/recording_test.go (24)
16. template/template.go (2)
17. template/template_test.go (26)
18. util/jsonutil/marshal.go (79)
19. web/api/v1/api.go (163)
20. web/api/v1/api_test.go (48)
21. web/federate.go (6)
22. web/federate_test.go (29)
cmd/promtool/unittest.go (5)

@ -347,7 +347,7 @@ Outer:
for _, s := range got {
gotSamples = append(gotSamples, parsedSample{
Labels: s.Metric.Copy(),
Value: s.V,
Value: s.F,
})
}
@ -447,7 +447,8 @@ func query(ctx context.Context, qs string, t time.Time, engine *promql.Engine, q
return v, nil
case promql.Scalar:
return promql.Vector{promql.Sample{
Point: promql.Point{T: v.T, V: v.V},
T: v.T,
F: v.V,
Metric: labels.Labels{},
}}, nil
default:

docs/querying/functions.md (37)

@ -17,9 +17,7 @@ _Notes about the experimental native histograms:_
flag](../feature_flags/#native-histograms). As long as no native histograms
have been ingested into the TSDB, all functions will behave as usual.
* Functions that do not explicitly mention native histograms in their
documentation (see below) effectively treat a native histogram as a float
sample of value 0. (This is confusing and will change before native
histograms become a stable feature.)
documentation (see below) will ignore histogram samples.
* Functions that do already act on native histograms might still change their
behavior in the future.
* If a function requires the same bucket layout between multiple native
@ -404,6 +402,8 @@ For each timeseries in `v`, `label_join(v instant-vector, dst_label string, sepa
using `separator` and returns the timeseries with the label `dst_label` containing the joined value.
There can be any number of `src_labels` in this function.
`label_join` acts on float and histogram samples in the same way.
This example will return a vector with each time series having a `foo` label with the value `a,b,c` added to it:
```
@ -419,6 +419,8 @@ of `replacement`, together with the original labels in the input. Capturing grou
regular expression can be referenced with `$1`, `$2`, etc. If the regular expression doesn't
match then the timeseries is returned unchanged.
`label_replace` acts on float and histogram samples in the same way.
This example will return timeseries with the values `a:c` at label `service` and `a` at label `foo`:
```
@ -501,10 +503,21 @@ counter resets when your target restarts.
For each input time series, `resets(v range-vector)` returns the number of
counter resets within the provided time range as an instant vector. Any
decrease in the value between two consecutive samples is interpreted as a
counter reset.
`resets` should only be used with counters.
decrease in the value between two consecutive float samples is interpreted as a
counter reset. A reset in a native histogram is detected in a more complex way:
Any decrease in any bucket, including the zero bucket, or in the count of
observations constitutes a counter reset, but also the disappearance of any
previously populated bucket, an increase in bucket resolution, or a decrease of
the zero-bucket width.
`resets` should only be used with counters and counter-like native
histograms.
If the range vector contains a mix of float and histogram samples for the same
series, counter resets are detected separately and their numbers added up. The
change from a float to a histogram sample is _not_ considered a counter
reset. Each float sample is compared to the next float sample, and each
histogram is compared to the next histogram.
## `round()`
@ -526,7 +539,7 @@ have exactly one element, `scalar` will return `NaN`.
## `sort()`
`sort(v instant-vector)` returns vector elements sorted by their sample values,
in ascending order.
in ascending order. Native histograms are sorted by their sum of observations.
## `sort_desc()`
@ -545,7 +558,8 @@ expression is to be evaluated.
## `timestamp()`
`timestamp(v instant-vector)` returns the timestamp of each of the samples of
the given vector as the number of seconds since January 1, 1970 UTC.
the given vector as the number of seconds since January 1, 1970 UTC. It also
works with histogram samples.
## `vector()`
@ -569,12 +583,15 @@ over time and return an instant vector with per-series aggregation results:
* `quantile_over_time(scalar, range-vector)`: the φ-quantile (0 ≤ φ ≤ 1) of the values in the specified interval.
* `stddev_over_time(range-vector)`: the population standard deviation of the values in the specified interval.
* `stdvar_over_time(range-vector)`: the population standard variance of the values in the specified interval.
* `last_over_time(range-vector)`: the most recent point value in specified interval.
* `last_over_time(range-vector)`: the most recent point value in the specified interval.
* `present_over_time(range-vector)`: the value 1 for any series in the specified interval.
Note that all values in the specified interval have the same weight in the
aggregation even if the values are not equally spaced throughout the interval.
`count_over_time`, `last_over_time`, and `present_over_time` handle native
histograms as expected. All other functions ignore histogram samples.
## Trigonometric Functions
The trigonometric functions work in radians:
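
As an aside to the `resets` documentation changes above: the histogram
reset rules they describe are checked by `FloatHistogram.DetectReset`
from the `model/histogram` package. Below is a minimal sketch of
counting resets across an ordered series of histograms
(`countHistogramResets` is a hypothetical helper for illustration, not
part of this commit):

```
package sketch

import "github.com/prometheus/prometheus/model/histogram"

// countHistogramResets counts counter resets in a chronologically
// ordered slice of histograms. DetectReset reports whether the
// receiver is a reset relative to the previous histogram (a dropped
// count or bucket, a disappeared bucket, increased bucket resolution,
// or a narrower zero bucket, per the rules quoted above).
func countHistogramResets(hs []*histogram.FloatHistogram) int {
	resets := 0
	for i := 1; i < len(hs); i++ {
		if hs[i].DetectReset(hs[i-1]) {
			resets++
		}
	}
	return resets
}
```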

promql/engine.go (390)

@ -189,7 +189,8 @@ func (q *query) Cancel() {
// Close implements the Query interface.
func (q *query) Close() {
for _, s := range q.matrix {
putPointSlice(s.Points)
putFPointSlice(s.Floats)
putHPointSlice(s.Histograms)
}
}
@ -680,11 +681,15 @@ func (ng *Engine) execEvalStmt(ctx context.Context, query *query, s *parser.Eval
for i, s := range mat {
// Point might have a different timestamp, force it to the evaluation
// timestamp as that is when we ran the evaluation.
vector[i] = Sample{Metric: s.Metric, Point: Point{V: s.Points[0].V, H: s.Points[0].H, T: start}}
if len(s.Histograms) > 0 {
vector[i] = Sample{Metric: s.Metric, H: s.Histograms[0].H, T: start}
} else {
vector[i] = Sample{Metric: s.Metric, F: s.Floats[0].F, T: start}
}
}
return vector, warnings, nil
case parser.ValueTypeScalar:
return Scalar{V: mat[0].Points[0].V, T: start}, warnings, nil
return Scalar{V: mat[0].Floats[0].F, T: start}, warnings, nil
case parser.ValueTypeMatrix:
return mat, warnings, nil
default:
@ -940,9 +945,10 @@ type errWithWarnings struct {
func (e errWithWarnings) Error() string { return e.err.Error() }
// An evaluator evaluates given expressions over given fixed timestamps. It
// is attached to an engine through which it connects to a querier and reports
// errors. On timeout or cancellation of its context it terminates.
// An evaluator evaluates the given expressions over the given fixed
// timestamps. It is attached to an engine through which it connects to a
// querier and reports errors. On timeout or cancellation of its context it
// terminates.
type evaluator struct {
ctx context.Context
@ -1137,17 +1143,35 @@ func (ev *evaluator) rangeEval(prepSeries func(labels.Labels, *EvalSeriesHelper)
}
for si, series := range matrixes[i] {
for _, point := range series.Points {
for _, point := range series.Floats {
if point.T == ts {
if ev.currentSamples < ev.maxSamples {
vectors[i] = append(vectors[i], Sample{Metric: series.Metric, F: point.F, T: ts})
if prepSeries != nil {
bufHelpers[i] = append(bufHelpers[i], seriesHelpers[i][si])
}
// Move input vectors forward so we don't have to re-scan the same
// past points at the next step.
matrixes[i][si].Floats = series.Floats[1:]
ev.currentSamples++
} else {
ev.error(ErrTooManySamples(env))
}
}
break
}
for _, point := range series.Histograms {
if point.T == ts {
if ev.currentSamples < ev.maxSamples {
vectors[i] = append(vectors[i], Sample{Metric: series.Metric, Point: point})
vectors[i] = append(vectors[i], Sample{Metric: series.Metric, H: point.H, T: ts})
if prepSeries != nil {
bufHelpers[i] = append(bufHelpers[i], seriesHelpers[i][si])
}
// Move input vectors forward so we don't have to re-scan the same
// past points at the next step.
matrixes[i][si].Points = series.Points[1:]
matrixes[i][si].Histograms = series.Histograms[1:]
ev.currentSamples++
} else {
ev.error(ErrTooManySamples(env))
@ -1184,8 +1208,11 @@ func (ev *evaluator) rangeEval(prepSeries func(labels.Labels, *EvalSeriesHelper)
if ev.endTimestamp == ev.startTimestamp {
mat := make(Matrix, len(result))
for i, s := range result {
s.Point.T = ts
mat[i] = Series{Metric: s.Metric, Points: []Point{s.Point}}
if s.H == nil {
mat[i] = Series{Metric: s.Metric, Floats: []FPoint{{T: ts, F: s.F}}}
} else {
mat[i] = Series{Metric: s.Metric, Histograms: []HPoint{{T: ts, H: s.H}}}
}
}
ev.currentSamples = originalNumSamples + mat.TotalSamples()
ev.samplesStats.UpdatePeak(ev.currentSamples)
@ -1197,22 +1224,28 @@ func (ev *evaluator) rangeEval(prepSeries func(labels.Labels, *EvalSeriesHelper)
h := sample.Metric.Hash()
ss, ok := seriess[h]
if !ok {
ss = Series{
Metric: sample.Metric,
Points: getPointSlice(numSteps),
ss = Series{Metric: sample.Metric}
}
if sample.H == nil {
if ss.Floats == nil {
ss.Floats = getFPointSlice(numSteps)
}
ss.Floats = append(ss.Floats, FPoint{T: ts, F: sample.F})
} else {
if ss.Histograms == nil {
ss.Histograms = getHPointSlice(numSteps)
}
ss.Histograms = append(ss.Histograms, HPoint{T: ts, H: sample.H})
}
sample.Point.T = ts
ss.Points = append(ss.Points, sample.Point)
seriess[h] = ss
}
}
// Reuse the original point slices.
for _, m := range origMatrixes {
for _, s := range m {
putPointSlice(s.Points)
putFPointSlice(s.Floats)
putHPointSlice(s.Histograms)
}
}
// Assemble the output matrix. By the time we get here we know we don't have too many samples.
@ -1253,7 +1286,7 @@ func (ev *evaluator) evalSubquery(subq *parser.SubqueryExpr) (*parser.MatrixSele
}
totalSamples := 0
for _, s := range mat {
totalSamples += len(s.Points)
totalSamples += len(s.Floats) + len(s.Histograms)
vs.Series = append(vs.Series, NewStorageSeries(s))
}
return ms, totalSamples, ws
@ -1297,7 +1330,7 @@ func (ev *evaluator) eval(expr parser.Expr) (parser.Value, storage.Warnings) {
return ev.rangeEval(initSeries, func(v []parser.Value, sh [][]EvalSeriesHelper, enh *EvalNodeHelper) (Vector, storage.Warnings) {
var param float64
if e.Param != nil {
param = v[0].(Vector)[0].V
param = v[0].(Vector)[0].F
}
return ev.aggregation(e.Op, sortedGrouping, e.Without, param, v[1].(Vector), sh[1], enh), nil
}, e.Param, e.Expr)
@ -1396,7 +1429,8 @@ func (ev *evaluator) eval(expr parser.Expr) (parser.Value, storage.Warnings) {
stepRange = ev.interval
}
// Reuse objects across steps to save memory allocations.
points := getPointSlice(16)
var floats []FPoint
var histograms []HPoint
inMatrix := make(Matrix, 1)
inArgs[matrixArgIndex] = inMatrix
enh := &EvalNodeHelper{Out: make(Vector, 0, 1)}
@ -1404,8 +1438,13 @@ func (ev *evaluator) eval(expr parser.Expr) (parser.Value, storage.Warnings) {
it := storage.NewBuffer(selRange)
var chkIter chunkenc.Iterator
for i, s := range selVS.Series {
ev.currentSamples -= len(points)
points = points[:0]
ev.currentSamples -= len(floats) + len(histograms)
if floats != nil {
floats = floats[:0]
}
if histograms != nil {
histograms = histograms[:0]
}
chkIter = s.Iterator(chkIter)
it.Reset(chkIter)
metric := selVS.Series[i].Labels()
@ -1418,7 +1457,6 @@ func (ev *evaluator) eval(expr parser.Expr) (parser.Value, storage.Warnings) {
}
ss := Series{
Metric: metric,
Points: getPointSlice(numSteps),
}
inMatrix[0].Metric = selVS.Series[i].Labels()
for ts, step := ev.startTimestamp, -1; ts <= ev.endTimestamp; ts += ev.interval {
@ -1428,44 +1466,54 @@ func (ev *evaluator) eval(expr parser.Expr) (parser.Value, storage.Warnings) {
// when looking up the argument, as there will be no gaps.
for j := range e.Args {
if j != matrixArgIndex {
otherInArgs[j][0].V = otherArgs[j][0].Points[step].V
otherInArgs[j][0].F = otherArgs[j][0].Floats[step].F
}
}
maxt := ts - offset
mint := maxt - selRange
// Evaluate the matrix selector for this series for this step.
points = ev.matrixIterSlice(it, mint, maxt, points)
if len(points) == 0 {
floats, histograms = ev.matrixIterSlice(it, mint, maxt, floats, histograms)
if len(floats)+len(histograms) == 0 {
continue
}
inMatrix[0].Points = points
inMatrix[0].Floats = floats
inMatrix[0].Histograms = histograms
enh.Ts = ts
// Make the function call.
outVec := call(inArgs, e.Args, enh)
ev.samplesStats.IncrementSamplesAtStep(step, int64(len(points)))
ev.samplesStats.IncrementSamplesAtStep(step, int64(len(floats)+len(histograms)))
enh.Out = outVec[:0]
if len(outVec) > 0 {
ss.Points = append(ss.Points, Point{V: outVec[0].Point.V, H: outVec[0].Point.H, T: ts})
if outVec[0].H == nil {
if ss.Floats == nil {
ss.Floats = getFPointSlice(numSteps)
}
ss.Floats = append(ss.Floats, FPoint{F: outVec[0].F, T: ts})
} else {
if ss.Histograms == nil {
ss.Histograms = getHPointSlice(numSteps)
}
ss.Histograms = append(ss.Histograms, HPoint{H: outVec[0].H, T: ts})
}
}
// Only buffer stepRange milliseconds from the second step on.
it.ReduceDelta(stepRange)
}
if len(ss.Points) > 0 {
if ev.currentSamples+len(ss.Points) <= ev.maxSamples {
if len(ss.Floats)+len(ss.Histograms) > 0 {
if ev.currentSamples+len(ss.Floats)+len(ss.Histograms) <= ev.maxSamples {
mat = append(mat, ss)
ev.currentSamples += len(ss.Points)
ev.currentSamples += len(ss.Floats) + len(ss.Histograms)
} else {
ev.error(ErrTooManySamples(env))
}
} else {
putPointSlice(ss.Points)
}
ev.samplesStats.UpdatePeak(ev.currentSamples)
}
ev.samplesStats.UpdatePeak(ev.currentSamples)
ev.currentSamples -= len(points)
putPointSlice(points)
ev.currentSamples -= len(floats) + len(histograms)
putFPointSlice(floats)
putHPointSlice(histograms)
// The absent_over_time function returns 0 or 1 series. So far, the matrix
// contains multiple series. The following code will create a new series
@ -1474,7 +1522,7 @@ func (ev *evaluator) eval(expr parser.Expr) (parser.Value, storage.Warnings) {
steps := int(1 + (ev.endTimestamp-ev.startTimestamp)/ev.interval)
// Iterate once to look for a complete series.
for _, s := range mat {
if len(s.Points) == steps {
if len(s.Floats)+len(s.Histograms) == steps {
return Matrix{}, warnings
}
}
@ -1482,7 +1530,10 @@ func (ev *evaluator) eval(expr parser.Expr) (parser.Value, storage.Warnings) {
found := map[int64]struct{}{}
for i, s := range mat {
for _, p := range s.Points {
for _, p := range s.Floats {
found[p.T] = struct{}{}
}
for _, p := range s.Histograms {
found[p.T] = struct{}{}
}
if i > 0 && len(found) == steps {
@ -1490,17 +1541,17 @@ func (ev *evaluator) eval(expr parser.Expr) (parser.Value, storage.Warnings) {
}
}
newp := make([]Point, 0, steps-len(found))
newp := make([]FPoint, 0, steps-len(found))
for ts := ev.startTimestamp; ts <= ev.endTimestamp; ts += ev.interval {
if _, ok := found[ts]; !ok {
newp = append(newp, Point{T: ts, V: 1})
newp = append(newp, FPoint{T: ts, F: 1})
}
}
return Matrix{
Series{
Metric: createLabelsForAbsentFunction(e.Args[0]),
Points: newp,
Floats: newp,
},
}, warnings
}
@ -1520,8 +1571,8 @@ func (ev *evaluator) eval(expr parser.Expr) (parser.Value, storage.Warnings) {
if e.Op == parser.SUB {
for i := range mat {
mat[i].Metric = dropMetricName(mat[i].Metric)
for j := range mat[i].Points {
mat[i].Points[j].V = -mat[i].Points[j].V
for j := range mat[i].Floats {
mat[i].Floats[j].F = -mat[i].Floats[j].F
}
}
if mat.ContainsSameLabelset() {
@ -1534,8 +1585,8 @@ func (ev *evaluator) eval(expr parser.Expr) (parser.Value, storage.Warnings) {
switch lt, rt := e.LHS.Type(), e.RHS.Type(); {
case lt == parser.ValueTypeScalar && rt == parser.ValueTypeScalar:
return ev.rangeEval(nil, func(v []parser.Value, _ [][]EvalSeriesHelper, enh *EvalNodeHelper) (Vector, storage.Warnings) {
val := scalarBinop(e.Op, v[0].(Vector)[0].Point.V, v[1].(Vector)[0].Point.V)
return append(enh.Out, Sample{Point: Point{V: val}}), nil
val := scalarBinop(e.Op, v[0].(Vector)[0].F, v[1].(Vector)[0].F)
return append(enh.Out, Sample{F: val}), nil
}, e.LHS, e.RHS)
case lt == parser.ValueTypeVector && rt == parser.ValueTypeVector:
// Function to compute the join signature for each series.
@ -1565,18 +1616,18 @@ func (ev *evaluator) eval(expr parser.Expr) (parser.Value, storage.Warnings) {
case lt == parser.ValueTypeVector && rt == parser.ValueTypeScalar:
return ev.rangeEval(nil, func(v []parser.Value, _ [][]EvalSeriesHelper, enh *EvalNodeHelper) (Vector, storage.Warnings) {
return ev.VectorscalarBinop(e.Op, v[0].(Vector), Scalar{V: v[1].(Vector)[0].Point.V}, false, e.ReturnBool, enh), nil
return ev.VectorscalarBinop(e.Op, v[0].(Vector), Scalar{V: v[1].(Vector)[0].F}, false, e.ReturnBool, enh), nil
}, e.LHS, e.RHS)
case lt == parser.ValueTypeScalar && rt == parser.ValueTypeVector:
return ev.rangeEval(nil, func(v []parser.Value, _ [][]EvalSeriesHelper, enh *EvalNodeHelper) (Vector, storage.Warnings) {
return ev.VectorscalarBinop(e.Op, v[1].(Vector), Scalar{V: v[0].(Vector)[0].Point.V}, true, e.ReturnBool, enh), nil
return ev.VectorscalarBinop(e.Op, v[1].(Vector), Scalar{V: v[0].(Vector)[0].F}, true, e.ReturnBool, enh), nil
}, e.LHS, e.RHS)
}
case *parser.NumberLiteral:
return ev.rangeEval(nil, func(v []parser.Value, _ [][]EvalSeriesHelper, enh *EvalNodeHelper) (Vector, storage.Warnings) {
return append(enh.Out, Sample{Point: Point{V: e.Val}, Metric: labels.EmptyLabels()}), nil
return append(enh.Out, Sample{F: e.Val, Metric: labels.EmptyLabels()}), nil
})
case *parser.StringLiteral:
@ -1595,15 +1646,24 @@ func (ev *evaluator) eval(expr parser.Expr) (parser.Value, storage.Warnings) {
it.Reset(chkIter)
ss := Series{
Metric: e.Series[i].Labels(),
Points: getPointSlice(numSteps),
}
for ts, step := ev.startTimestamp, -1; ts <= ev.endTimestamp; ts += ev.interval {
step++
_, v, h, ok := ev.vectorSelectorSingle(it, e, ts)
_, f, h, ok := ev.vectorSelectorSingle(it, e, ts)
if ok {
if ev.currentSamples < ev.maxSamples {
ss.Points = append(ss.Points, Point{V: v, H: h, T: ts})
if h == nil {
if ss.Floats == nil {
ss.Floats = getFPointSlice(numSteps)
}
ss.Floats = append(ss.Floats, FPoint{F: f, T: ts})
} else {
if ss.Histograms == nil {
ss.Histograms = getHPointSlice(numSteps)
}
ss.Histograms = append(ss.Histograms, HPoint{H: h, T: ts})
}
ev.samplesStats.IncrementSamplesAtStep(step, 1)
ev.currentSamples++
} else {
@ -1612,10 +1672,8 @@ func (ev *evaluator) eval(expr parser.Expr) (parser.Value, storage.Warnings) {
}
}
if len(ss.Points) > 0 {
if len(ss.Floats)+len(ss.Histograms) > 0 {
mat = append(mat, ss)
} else {
putPointSlice(ss.Points)
}
}
ev.samplesStats.UpdatePeak(ev.currentSamples)
@ -1706,15 +1764,21 @@ func (ev *evaluator) eval(expr parser.Expr) (parser.Value, storage.Warnings) {
panic(fmt.Errorf("unexpected result in StepInvariantExpr evaluation: %T", expr))
}
for i := range mat {
if len(mat[i].Points) != 1 {
if len(mat[i].Floats)+len(mat[i].Histograms) != 1 {
panic(fmt.Errorf("unexpected number of samples"))
}
for ts := ev.startTimestamp + ev.interval; ts <= ev.endTimestamp; ts = ts + ev.interval {
mat[i].Points = append(mat[i].Points, Point{
T: ts,
V: mat[i].Points[0].V,
H: mat[i].Points[0].H,
})
if len(mat[i].Floats) > 0 {
mat[i].Floats = append(mat[i].Floats, FPoint{
T: ts,
F: mat[i].Floats[0].F,
})
} else {
mat[i].Histograms = append(mat[i].Histograms, HPoint{
T: ts,
H: mat[i].Histograms[0].H,
})
}
ev.currentSamples++
if ev.currentSamples > ev.maxSamples {
ev.error(ErrTooManySamples(env))
@ -1741,11 +1805,13 @@ func (ev *evaluator) vectorSelector(node *parser.VectorSelector, ts int64) (Vect
chkIter = s.Iterator(chkIter)
it.Reset(chkIter)
t, v, h, ok := ev.vectorSelectorSingle(it, node, ts)
t, f, h, ok := ev.vectorSelectorSingle(it, node, ts)
if ok {
vec = append(vec, Sample{
Metric: node.Series[i].Labels(),
Point: Point{V: v, H: h, T: t},
T: t,
F: f,
H: h,
})
ev.currentSamples++
@ -1795,17 +1861,31 @@ func (ev *evaluator) vectorSelectorSingle(it *storage.MemoizedSeriesIterator, no
return t, v, h, true
}
var pointPool zeropool.Pool[[]Point]
var (
fPointPool zeropool.Pool[[]FPoint]
hPointPool zeropool.Pool[[]HPoint]
)
func getPointSlice(sz int) []Point {
if p := pointPool.Get(); p != nil {
func getFPointSlice(sz int) []FPoint {
if p := fPointPool.Get(); p != nil {
return p
}
return make([]Point, 0, sz)
return make([]FPoint, 0, sz)
}
func putPointSlice(p []Point) {
pointPool.Put(p[:0])
func putFPointSlice(p []FPoint) {
fPointPool.Put(p[:0])
}
func getHPointSlice(sz int) []HPoint {
if p := hPointPool.Get(); p != nil {
return p
}
return make([]HPoint, 0, sz)
}
func putHPointSlice(p []HPoint) {
hPointPool.Put(p[:0])
}
// matrixSelector evaluates a *parser.MatrixSelector expression.
@ -1837,13 +1917,15 @@ func (ev *evaluator) matrixSelector(node *parser.MatrixSelector) (Matrix, storag
Metric: series[i].Labels(),
}
ss.Points = ev.matrixIterSlice(it, mint, maxt, getPointSlice(16))
ev.samplesStats.IncrementSamplesAtTimestamp(ev.startTimestamp, int64(len(ss.Points)))
ss.Floats, ss.Histograms = ev.matrixIterSlice(it, mint, maxt, nil, nil)
totalLen := int64(len(ss.Floats)) + int64(len(ss.Histograms))
ev.samplesStats.IncrementSamplesAtTimestamp(ev.startTimestamp, totalLen)
if len(ss.Points) > 0 {
if totalLen > 0 {
matrix = append(matrix, ss)
} else {
putPointSlice(ss.Points)
putFPointSlice(ss.Floats)
putHPointSlice(ss.Histograms)
}
}
return matrix, ws
@ -1857,24 +1939,54 @@ func (ev *evaluator) matrixSelector(node *parser.MatrixSelector) (Matrix, storag
// values). Any such points falling before mint are discarded; points that fall
// into the [mint, maxt] range are retained; only points with later timestamps
// are populated from the iterator.
func (ev *evaluator) matrixIterSlice(it *storage.BufferedSeriesIterator, mint, maxt int64, out []Point) []Point {
if len(out) > 0 && out[len(out)-1].T >= mint {
func (ev *evaluator) matrixIterSlice(
it *storage.BufferedSeriesIterator, mint, maxt int64,
floats []FPoint, histograms []HPoint,
) ([]FPoint, []HPoint) {
mintFloats, mintHistograms := mint, mint
// First floats...
if len(floats) > 0 && floats[len(floats)-1].T >= mint {
// There is an overlap between previous and current ranges, retain common
// points. In most such cases:
// (a) the overlap is significantly larger than the eval step; and/or
// (b) the number of samples is relatively small.
// so a linear search will be as fast as a binary search.
var drop int
for drop = 0; floats[drop].T < mint; drop++ {
}
ev.currentSamples -= drop
copy(floats, floats[drop:])
floats = floats[:len(floats)-drop]
// Only append points with timestamps after the last timestamp we have.
mintFloats = floats[len(floats)-1].T + 1
} else {
ev.currentSamples -= len(floats)
if floats != nil {
floats = floats[:0]
}
}
// ...then the same for histograms. TODO(beorn7): Use generics?
if len(histograms) > 0 && histograms[len(histograms)-1].T >= mint {
// There is an overlap between previous and current ranges, retain common
// points. In most such cases:
// (a) the overlap is significantly larger than the eval step; and/or
// (b) the number of samples is relatively small.
// so a linear search will be as fast as a binary search.
var drop int
for drop = 0; out[drop].T < mint; drop++ {
for drop = 0; histograms[drop].T < mint; drop++ {
}
ev.currentSamples -= drop
copy(out, out[drop:])
out = out[:len(out)-drop]
copy(histograms, histograms[drop:])
histograms = histograms[:len(histograms)-drop]
// Only append points with timestamps after the last timestamp we have.
mint = out[len(out)-1].T + 1
mintHistograms = histograms[len(histograms)-1].T + 1
} else {
ev.currentSamples -= len(out)
out = out[:0]
ev.currentSamples -= len(histograms)
if histograms != nil {
histograms = histograms[:0]
}
}
soughtValueType := it.Seek(maxt)
@ -1896,25 +2008,31 @@ loop:
continue loop
}
// Values in the buffer are guaranteed to be smaller than maxt.
if t >= mint {
if t >= mintHistograms {
if ev.currentSamples >= ev.maxSamples {
ev.error(ErrTooManySamples(env))
}
ev.currentSamples++
out = append(out, Point{T: t, H: h})
if histograms == nil {
histograms = getHPointSlice(16)
}
histograms = append(histograms, HPoint{T: t, H: h})
}
case chunkenc.ValFloat:
t, v := buf.At()
if value.IsStaleNaN(v) {
t, f := buf.At()
if value.IsStaleNaN(f) {
continue loop
}
// Values in the buffer are guaranteed to be smaller than maxt.
if t >= mint {
if t >= mintFloats {
if ev.currentSamples >= ev.maxSamples {
ev.error(ErrTooManySamples(env))
}
ev.currentSamples++
out = append(out, Point{T: t, V: v})
if floats == nil {
floats = getFPointSlice(16)
}
floats = append(floats, FPoint{T: t, F: f})
}
}
}
@ -1926,21 +2044,27 @@ loop:
if ev.currentSamples >= ev.maxSamples {
ev.error(ErrTooManySamples(env))
}
out = append(out, Point{T: t, H: h})
if histograms == nil {
histograms = getHPointSlice(16)
}
histograms = append(histograms, HPoint{T: t, H: h})
ev.currentSamples++
}
case chunkenc.ValFloat:
t, v := it.At()
if t == maxt && !value.IsStaleNaN(v) {
t, f := it.At()
if t == maxt && !value.IsStaleNaN(f) {
if ev.currentSamples >= ev.maxSamples {
ev.error(ErrTooManySamples(env))
}
out = append(out, Point{T: t, V: v})
if floats == nil {
floats = getFPointSlice(16)
}
floats = append(floats, FPoint{T: t, F: f})
ev.currentSamples++
}
}
ev.samplesStats.UpdatePeak(ev.currentSamples)
return out
return floats, histograms
}
func (ev *evaluator) VectorAnd(lhs, rhs Vector, matching *parser.VectorMatching, lhsh, rhsh []EvalSeriesHelper, enh *EvalNodeHelper) Vector {
@ -2086,18 +2210,18 @@ func (ev *evaluator) VectorBinop(op parser.ItemType, lhs, rhs Vector, matching *
}
// Account for potentially swapped sidedness.
vl, vr := ls.V, rs.V
fl, fr := ls.F, rs.F
hl, hr := ls.H, rs.H
if matching.Card == parser.CardOneToMany {
vl, vr = vr, vl
fl, fr = fr, fl
hl, hr = hr, hl
}
value, histogramValue, keep := vectorElemBinop(op, vl, vr, hl, hr)
floatValue, histogramValue, keep := vectorElemBinop(op, fl, fr, hl, hr)
if returnBool {
if keep {
value = 1.0
floatValue = 1.0
} else {
value = 0.0
floatValue = 0.0
}
} else if !keep {
continue
@ -2131,7 +2255,8 @@ func (ev *evaluator) VectorBinop(op parser.ItemType, lhs, rhs Vector, matching *
// Both lhs and rhs are of same type.
enh.Out = append(enh.Out, Sample{
Metric: metric,
Point: Point{V: value, H: histogramValue},
F: floatValue,
H: histogramValue,
})
}
}
@ -2200,7 +2325,7 @@ func resultMetric(lhs, rhs labels.Labels, op parser.ItemType, matching *parser.V
// VectorscalarBinop evaluates a binary operation between a Vector and a Scalar.
func (ev *evaluator) VectorscalarBinop(op parser.ItemType, lhs Vector, rhs Scalar, swap, returnBool bool, enh *EvalNodeHelper) Vector {
for _, lhsSample := range lhs {
lv, rv := lhsSample.V, rhs.V
lv, rv := lhsSample.F, rhs.V
// lhs always contains the Vector. If the original position was different
// swap for calculating the value.
if swap {
@ -2221,7 +2346,7 @@ func (ev *evaluator) VectorscalarBinop(op parser.ItemType, lhs Vector, rhs Scala
keep = true
}
if keep {
lhsSample.V = value
lhsSample.F = value
if shouldDropMetricName(op) || returnBool {
lhsSample.Metric = enh.DropMetricName(lhsSample.Metric)
}
@ -2313,7 +2438,7 @@ type groupedAggregation struct {
hasFloat bool // Has at least 1 float64 sample aggregated.
hasHistogram bool // Has at least 1 histogram sample aggregated.
labels labels.Labels
value float64
floatValue float64
histogramValue *histogram.FloatHistogram
mean float64
groupCount int
@ -2365,7 +2490,7 @@ func (ev *evaluator) aggregation(op parser.ItemType, grouping []string, without
if op == parser.COUNT_VALUES {
enh.resetBuilder(metric)
enh.lb.Set(valueLabel, strconv.FormatFloat(s.V, 'f', -1, 64))
enh.lb.Set(valueLabel, strconv.FormatFloat(s.F, 'f', -1, 64))
metric = enh.lb.Labels()
// We've changed the metric so we have to recompute the grouping key.
@ -2397,8 +2522,8 @@ func (ev *evaluator) aggregation(op parser.ItemType, grouping []string, without
}
newAgg := &groupedAggregation{
labels: m,
value: s.V,
mean: s.V,
floatValue: s.F,
mean: s.F,
groupCount: 1,
}
if s.H == nil {
@ -2420,21 +2545,21 @@ func (ev *evaluator) aggregation(op parser.ItemType, grouping []string, without
}
switch op {
case parser.STDVAR, parser.STDDEV:
result[groupingKey].value = 0
result[groupingKey].floatValue = 0
case parser.TOPK, parser.QUANTILE:
result[groupingKey].heap = make(vectorByValueHeap, 1, resultSize)
result[groupingKey].heap[0] = Sample{
Point: Point{V: s.V},
F: s.F,
Metric: s.Metric,
}
case parser.BOTTOMK:
result[groupingKey].reverseHeap = make(vectorByReverseValueHeap, 1, resultSize)
result[groupingKey].reverseHeap[0] = Sample{
Point: Point{V: s.V},
F: s.F,
Metric: s.Metric,
}
case parser.GROUP:
result[groupingKey].value = 1
result[groupingKey].floatValue = 1
}
continue
}
@ -2459,19 +2584,19 @@ func (ev *evaluator) aggregation(op parser.ItemType, grouping []string, without
// point in copying the histogram in that case.
} else {
group.hasFloat = true
group.value += s.V
group.floatValue += s.F
}
case parser.AVG:
group.groupCount++
if math.IsInf(group.mean, 0) {
if math.IsInf(s.V, 0) && (group.mean > 0) == (s.V > 0) {
if math.IsInf(s.F, 0) && (group.mean > 0) == (s.F > 0) {
// The `mean` and `s.V` values are `Inf` of the same sign. They
// can't be subtracted, but the value of `mean` is correct
// already.
break
}
if !math.IsInf(s.V, 0) && !math.IsNaN(s.V) {
if !math.IsInf(s.F, 0) && !math.IsNaN(s.F) {
// At this stage, the mean is an infinite. If the added
// value is neither an Inf or a Nan, we can keep that mean
// value.
@ -2482,19 +2607,19 @@ func (ev *evaluator) aggregation(op parser.ItemType, grouping []string, without
}
}
// Divide each side of the `-` by `group.groupCount` to avoid float64 overflows.
group.mean += s.V/float64(group.groupCount) - group.mean/float64(group.groupCount)
group.mean += s.F/float64(group.groupCount) - group.mean/float64(group.groupCount)
case parser.GROUP:
// Do nothing. Required to avoid the panic in `default:` below.
case parser.MAX:
if group.value < s.V || math.IsNaN(group.value) {
group.value = s.V
if group.floatValue < s.F || math.IsNaN(group.floatValue) {
group.floatValue = s.F
}
case parser.MIN:
if group.value > s.V || math.IsNaN(group.value) {
group.value = s.V
if group.floatValue > s.F || math.IsNaN(group.floatValue) {
group.floatValue = s.F
}
case parser.COUNT, parser.COUNT_VALUES:
@ -2502,21 +2627,21 @@ func (ev *evaluator) aggregation(op parser.ItemType, grouping []string, without
case parser.STDVAR, parser.STDDEV:
group.groupCount++
delta := s.V - group.mean
delta := s.F - group.mean
group.mean += delta / float64(group.groupCount)
group.value += delta * (s.V - group.mean)
group.floatValue += delta * (s.F - group.mean)
case parser.TOPK:
// We build a heap of up to k elements, with the smallest element at heap[0].
if int64(len(group.heap)) < k {
heap.Push(&group.heap, &Sample{
Point: Point{V: s.V},
F: s.F,
Metric: s.Metric,
})
} else if group.heap[0].V < s.V || (math.IsNaN(group.heap[0].V) && !math.IsNaN(s.V)) {
} else if group.heap[0].F < s.F || (math.IsNaN(group.heap[0].F) && !math.IsNaN(s.F)) {
// This new element is bigger than the previous smallest element - overwrite that.
group.heap[0] = Sample{
Point: Point{V: s.V},
F: s.F,
Metric: s.Metric,
}
if k > 1 {
@ -2528,13 +2653,13 @@ func (ev *evaluator) aggregation(op parser.ItemType, grouping []string, without
// We build a heap of up to k elements, with the biggest element at heap[0].
if int64(len(group.reverseHeap)) < k {
heap.Push(&group.reverseHeap, &Sample{
Point: Point{V: s.V},
F: s.F,
Metric: s.Metric,
})
} else if group.reverseHeap[0].V > s.V || (math.IsNaN(group.reverseHeap[0].V) && !math.IsNaN(s.V)) {
} else if group.reverseHeap[0].F > s.F || (math.IsNaN(group.reverseHeap[0].F) && !math.IsNaN(s.F)) {
// This new element is smaller than the previous biggest element - overwrite that.
group.reverseHeap[0] = Sample{
Point: Point{V: s.V},
F: s.F,
Metric: s.Metric,
}
if k > 1 {
@ -2554,16 +2679,16 @@ func (ev *evaluator) aggregation(op parser.ItemType, grouping []string, without
for _, aggr := range orderedResult {
switch op {
case parser.AVG:
aggr.value = aggr.mean
aggr.floatValue = aggr.mean
case parser.COUNT, parser.COUNT_VALUES:
aggr.value = float64(aggr.groupCount)
aggr.floatValue = float64(aggr.groupCount)
case parser.STDVAR:
aggr.value = aggr.value / float64(aggr.groupCount)
aggr.floatValue = aggr.floatValue / float64(aggr.groupCount)
case parser.STDDEV:
aggr.value = math.Sqrt(aggr.value / float64(aggr.groupCount))
aggr.floatValue = math.Sqrt(aggr.floatValue / float64(aggr.groupCount))
case parser.TOPK:
// The heap keeps the lowest value on top, so reverse it.
@ -2573,7 +2698,7 @@ func (ev *evaluator) aggregation(op parser.ItemType, grouping []string, without
for _, v := range aggr.heap {
enh.Out = append(enh.Out, Sample{
Metric: v.Metric,
Point: Point{V: v.V},
F: v.F,
})
}
continue // Bypass default append.
@ -2586,13 +2711,13 @@ func (ev *evaluator) aggregation(op parser.ItemType, grouping []string, without
for _, v := range aggr.reverseHeap {
enh.Out = append(enh.Out, Sample{
Metric: v.Metric,
Point: Point{V: v.V},
F: v.F,
})
}
continue // Bypass default append.
case parser.QUANTILE:
aggr.value = quantile(q, aggr.heap)
aggr.floatValue = quantile(q, aggr.heap)
case parser.SUM:
if aggr.hasFloat && aggr.hasHistogram {
@ -2605,7 +2730,8 @@ func (ev *evaluator) aggregation(op parser.ItemType, grouping []string, without
enh.Out = append(enh.Out, Sample{
Metric: aggr.labels,
Point: Point{V: aggr.value, H: aggr.histogramValue},
F: aggr.floatValue,
H: aggr.histogramValue,
})
}
return enh.Out

promql/engine_test.go (141)

@ -662,7 +662,8 @@ load 10s
Query: "metric",
Result: Vector{
Sample{
Point: Point{V: 1, T: 1000},
F: 1,
T: 1000,
Metric: labels.FromStrings("__name__", "metric"),
},
},
@ -672,7 +673,7 @@ load 10s
Query: "metric[20s]",
Result: Matrix{
Series{
Points: []Point{{V: 1, T: 0}, {V: 2, T: 10000}},
Floats: []FPoint{{F: 1, T: 0}, {F: 2, T: 10000}},
Metric: labels.FromStrings("__name__", "metric"),
},
},
@ -683,7 +684,7 @@ load 10s
Query: "1",
Result: Matrix{
Series{
Points: []Point{{V: 1, T: 0}, {V: 1, T: 1000}, {V: 1, T: 2000}},
Floats: []FPoint{{F: 1, T: 0}, {F: 1, T: 1000}, {F: 1, T: 2000}},
Metric: labels.EmptyLabels(),
},
},
@ -695,7 +696,7 @@ load 10s
Query: "metric",
Result: Matrix{
Series{
Points: []Point{{V: 1, T: 0}, {V: 1, T: 1000}, {V: 1, T: 2000}},
Floats: []FPoint{{F: 1, T: 0}, {F: 1, T: 1000}, {F: 1, T: 2000}},
Metric: labels.FromStrings("__name__", "metric"),
},
},
@ -707,7 +708,7 @@ load 10s
Query: "metric",
Result: Matrix{
Series{
Points: []Point{{V: 1, T: 0}, {V: 1, T: 5000}, {V: 2, T: 10000}},
Floats: []FPoint{{F: 1, T: 0}, {F: 1, T: 5000}, {F: 2, T: 10000}},
Metric: labels.FromStrings("__name__", "metric"),
},
},
@ -1462,20 +1463,20 @@ load 1ms
query: `metric_neg @ 0`,
start: 100,
result: Vector{
Sample{Point: Point{V: 1, T: 100000}, Metric: lblsneg},
Sample{F: 1, T: 100000, Metric: lblsneg},
},
}, {
query: `metric_neg @ -200`,
start: 100,
result: Vector{
Sample{Point: Point{V: 201, T: 100000}, Metric: lblsneg},
Sample{F: 201, T: 100000, Metric: lblsneg},
},
}, {
query: `metric{job="2"} @ 50`,
start: -2, end: 2, interval: 1,
result: Matrix{
Series{
Points: []Point{{V: 10, T: -2000}, {V: 10, T: -1000}, {V: 10, T: 0}, {V: 10, T: 1000}, {V: 10, T: 2000}},
Floats: []FPoint{{F: 10, T: -2000}, {F: 10, T: -1000}, {F: 10, T: 0}, {F: 10, T: 1000}, {F: 10, T: 2000}},
Metric: lbls2,
},
},
@ -1484,11 +1485,11 @@ load 1ms
start: 10,
result: Matrix{
Series{
Points: []Point{{V: 28, T: 280000}, {V: 29, T: 290000}, {V: 30, T: 300000}},
Floats: []FPoint{{F: 28, T: 280000}, {F: 29, T: 290000}, {F: 30, T: 300000}},
Metric: lbls1,
},
Series{
Points: []Point{{V: 56, T: 280000}, {V: 58, T: 290000}, {V: 60, T: 300000}},
Floats: []FPoint{{F: 56, T: 280000}, {F: 58, T: 290000}, {F: 60, T: 300000}},
Metric: lbls2,
},
},
@ -1497,7 +1498,7 @@ load 1ms
start: 100,
result: Matrix{
Series{
Points: []Point{{V: 3, T: -2000}, {V: 2, T: -1000}, {V: 1, T: 0}},
Floats: []FPoint{{F: 3, T: -2000}, {F: 2, T: -1000}, {F: 1, T: 0}},
Metric: lblsneg,
},
},
@ -1506,7 +1507,7 @@ load 1ms
start: 100,
result: Matrix{
Series{
Points: []Point{{V: 504, T: -503000}, {V: 503, T: -502000}, {V: 502, T: -501000}, {V: 501, T: -500000}},
Floats: []FPoint{{F: 504, T: -503000}, {F: 503, T: -502000}, {F: 502, T: -501000}, {F: 501, T: -500000}},
Metric: lblsneg,
},
},
@ -1515,7 +1516,7 @@ load 1ms
start: 100,
result: Matrix{
Series{
Points: []Point{{V: 2342, T: 2342}, {V: 2343, T: 2343}, {V: 2344, T: 2344}, {V: 2345, T: 2345}},
Floats: []FPoint{{F: 2342, T: 2342}, {F: 2343, T: 2343}, {F: 2344, T: 2344}, {F: 2345, T: 2345}},
Metric: lblsms,
},
},
@ -1524,11 +1525,11 @@ load 1ms
start: 100,
result: Matrix{
Series{
Points: []Point{{V: 20, T: 200000}, {V: 22, T: 225000}, {V: 25, T: 250000}, {V: 27, T: 275000}, {V: 30, T: 300000}},
Floats: []FPoint{{F: 20, T: 200000}, {F: 22, T: 225000}, {F: 25, T: 250000}, {F: 27, T: 275000}, {F: 30, T: 300000}},
Metric: lbls1,
},
Series{
Points: []Point{{V: 40, T: 200000}, {V: 44, T: 225000}, {V: 50, T: 250000}, {V: 54, T: 275000}, {V: 60, T: 300000}},
Floats: []FPoint{{F: 40, T: 200000}, {F: 44, T: 225000}, {F: 50, T: 250000}, {F: 54, T: 275000}, {F: 60, T: 300000}},
Metric: lbls2,
},
},
@ -1537,7 +1538,7 @@ load 1ms
start: 100,
result: Matrix{
Series{
Points: []Point{{V: 51, T: -50000}, {V: 26, T: -25000}, {V: 1, T: 0}},
Floats: []FPoint{{F: 51, T: -50000}, {F: 26, T: -25000}, {F: 1, T: 0}},
Metric: lblsneg,
},
},
@ -1546,7 +1547,7 @@ load 1ms
start: 100,
result: Matrix{
Series{
Points: []Point{{V: 151, T: -150000}, {V: 126, T: -125000}, {V: 101, T: -100000}},
Floats: []FPoint{{F: 151, T: -150000}, {F: 126, T: -125000}, {F: 101, T: -100000}},
Metric: lblsneg,
},
},
@ -1555,7 +1556,7 @@ load 1ms
start: 100,
result: Matrix{
Series{
Points: []Point{{V: 2250, T: 2250}, {V: 2275, T: 2275}, {V: 2300, T: 2300}, {V: 2325, T: 2325}},
Floats: []FPoint{{F: 2250, T: 2250}, {F: 2275, T: 2275}, {F: 2300, T: 2300}, {F: 2325, T: 2325}},
Metric: lblsms,
},
},
@ -1564,7 +1565,7 @@ load 1ms
start: 50, end: 80, interval: 10,
result: Matrix{
Series{
Points: []Point{{V: 995, T: 50000}, {V: 994, T: 60000}, {V: 993, T: 70000}, {V: 992, T: 80000}},
Floats: []FPoint{{F: 995, T: 50000}, {F: 994, T: 60000}, {F: 993, T: 70000}, {F: 992, T: 80000}},
Metric: lblstopk3,
},
},
@ -1573,7 +1574,7 @@ load 1ms
start: 50, end: 80, interval: 10,
result: Matrix{
Series{
Points: []Point{{V: 10, T: 50000}, {V: 12, T: 60000}, {V: 14, T: 70000}, {V: 16, T: 80000}},
Floats: []FPoint{{F: 10, T: 50000}, {F: 12, T: 60000}, {F: 14, T: 70000}, {F: 16, T: 80000}},
Metric: lblstopk2,
},
},
@ -1582,7 +1583,7 @@ load 1ms
start: 70, end: 100, interval: 10,
result: Matrix{
Series{
Points: []Point{{V: 993, T: 70000}, {V: 992, T: 80000}, {V: 991, T: 90000}, {V: 990, T: 100000}},
Floats: []FPoint{{F: 993, T: 70000}, {F: 992, T: 80000}, {F: 991, T: 90000}, {F: 990, T: 100000}},
Metric: lblstopk3,
},
},
@ -1591,7 +1592,7 @@ load 1ms
start: 100, end: 130, interval: 10,
result: Matrix{
Series{
Points: []Point{{V: 990, T: 100000}, {V: 989, T: 110000}, {V: 988, T: 120000}, {V: 987, T: 130000}},
Floats: []FPoint{{F: 990, T: 100000}, {F: 989, T: 110000}, {F: 988, T: 120000}, {F: 987, T: 130000}},
Metric: lblstopk3,
},
},
@ -1602,15 +1603,15 @@ load 1ms
start: 0, end: 7 * 60, interval: 60,
result: Matrix{
Series{
Points: []Point{
{V: 3600, T: 0},
{V: 3600, T: 60 * 1000},
{V: 3600, T: 2 * 60 * 1000},
{V: 3600, T: 3 * 60 * 1000},
{V: 3600, T: 4 * 60 * 1000},
{V: 3600, T: 5 * 60 * 1000},
{V: 3600, T: 6 * 60 * 1000},
{V: 3600, T: 7 * 60 * 1000},
Floats: []FPoint{
{F: 3600, T: 0},
{F: 3600, T: 60 * 1000},
{F: 3600, T: 2 * 60 * 1000},
{F: 3600, T: 3 * 60 * 1000},
{F: 3600, T: 4 * 60 * 1000},
{F: 3600, T: 5 * 60 * 1000},
{F: 3600, T: 6 * 60 * 1000},
{F: 3600, T: 7 * 60 * 1000},
},
Metric: labels.EmptyLabels(),
},
@ -1723,7 +1724,7 @@ func TestSubquerySelector(t *testing.T) {
nil,
Matrix{
Series{
Points: []Point{{V: 1, T: 0}, {V: 2, T: 10000}},
Floats: []FPoint{{F: 1, T: 0}, {F: 2, T: 10000}},
Metric: labels.FromStrings("__name__", "metric"),
},
},
@ -1737,7 +1738,7 @@ func TestSubquerySelector(t *testing.T) {
nil,
Matrix{
Series{
Points: []Point{{V: 1, T: 0}, {V: 1, T: 5000}, {V: 2, T: 10000}},
Floats: []FPoint{{F: 1, T: 0}, {F: 1, T: 5000}, {F: 2, T: 10000}},
Metric: labels.FromStrings("__name__", "metric"),
},
},
@ -1751,7 +1752,7 @@ func TestSubquerySelector(t *testing.T) {
nil,
Matrix{
Series{
Points: []Point{{V: 1, T: 0}, {V: 1, T: 5000}, {V: 2, T: 10000}},
Floats: []FPoint{{F: 1, T: 0}, {F: 1, T: 5000}, {F: 2, T: 10000}},
Metric: labels.FromStrings("__name__", "metric"),
},
},
@ -1765,7 +1766,7 @@ func TestSubquerySelector(t *testing.T) {
nil,
Matrix{
Series{
Points: []Point{{V: 1, T: 0}, {V: 1, T: 5000}, {V: 2, T: 10000}},
Floats: []FPoint{{F: 1, T: 0}, {F: 1, T: 5000}, {F: 2, T: 10000}},
Metric: labels.FromStrings("__name__", "metric"),
},
},
@ -1779,7 +1780,7 @@ func TestSubquerySelector(t *testing.T) {
nil,
Matrix{
Series{
Points: []Point{{V: 2, T: 15000}, {V: 2, T: 20000}, {V: 2, T: 25000}, {V: 2, T: 30000}},
Floats: []FPoint{{F: 2, T: 15000}, {F: 2, T: 20000}, {F: 2, T: 25000}, {F: 2, T: 30000}},
Metric: labels.FromStrings("__name__", "metric"),
},
},
@ -1793,7 +1794,7 @@ func TestSubquerySelector(t *testing.T) {
nil,
Matrix{
Series{
Points: []Point{{V: 2, T: 10000}, {V: 2, T: 15000}, {V: 2, T: 20000}, {V: 2, T: 25000}, {V: 2, T: 30000}},
Floats: []FPoint{{F: 2, T: 10000}, {F: 2, T: 15000}, {F: 2, T: 20000}, {F: 2, T: 25000}, {F: 2, T: 30000}},
Metric: labels.FromStrings("__name__", "metric"),
},
},
@ -1807,7 +1808,7 @@ func TestSubquerySelector(t *testing.T) {
nil,
Matrix{
Series{
Points: []Point{{V: 2, T: 10000}, {V: 2, T: 15000}, {V: 2, T: 20000}, {V: 2, T: 25000}},
Floats: []FPoint{{F: 2, T: 10000}, {F: 2, T: 15000}, {F: 2, T: 20000}, {F: 2, T: 25000}},
Metric: labels.FromStrings("__name__", "metric"),
},
},
@ -1821,7 +1822,7 @@ func TestSubquerySelector(t *testing.T) {
nil,
Matrix{
Series{
Points: []Point{{V: 2, T: 10000}, {V: 2, T: 15000}, {V: 2, T: 20000}, {V: 2, T: 25000}},
Floats: []FPoint{{F: 2, T: 10000}, {F: 2, T: 15000}, {F: 2, T: 20000}, {F: 2, T: 25000}},
Metric: labels.FromStrings("__name__", "metric"),
},
},
@ -1844,7 +1845,7 @@ func TestSubquerySelector(t *testing.T) {
nil,
Matrix{
Series{
Points: []Point{{V: 9990, T: 9990000}, {V: 10000, T: 10000000}, {V: 100, T: 10010000}, {V: 130, T: 10020000}},
Floats: []FPoint{{F: 9990, T: 9990000}, {F: 10000, T: 10000000}, {F: 100, T: 10010000}, {F: 130, T: 10020000}},
Metric: labels.FromStrings("__name__", "http_requests", "job", "api-server", "instance", "0", "group", "production"),
},
},
@ -1858,7 +1859,7 @@ func TestSubquerySelector(t *testing.T) {
nil,
Matrix{
Series{
Points: []Point{{V: 9840, T: 9840000}, {V: 9900, T: 9900000}, {V: 9960, T: 9960000}, {V: 130, T: 10020000}, {V: 310, T: 10080000}},
Floats: []FPoint{{F: 9840, T: 9840000}, {F: 9900, T: 9900000}, {F: 9960, T: 9960000}, {F: 130, T: 10020000}, {F: 310, T: 10080000}},
Metric: labels.FromStrings("__name__", "http_requests", "job", "api-server", "instance", "0", "group", "production"),
},
},
@ -1872,7 +1873,7 @@ func TestSubquerySelector(t *testing.T) {
nil,
Matrix{
Series{
Points: []Point{{V: 8640, T: 8640000}, {V: 8700, T: 8700000}, {V: 8760, T: 8760000}, {V: 8820, T: 8820000}, {V: 8880, T: 8880000}},
Floats: []FPoint{{F: 8640, T: 8640000}, {F: 8700, T: 8700000}, {F: 8760, T: 8760000}, {F: 8820, T: 8820000}, {F: 8880, T: 8880000}},
Metric: labels.FromStrings("__name__", "http_requests", "job", "api-server", "instance", "0", "group", "production"),
},
},
@ -1886,19 +1887,19 @@ func TestSubquerySelector(t *testing.T) {
nil,
Matrix{
Series{
Points: []Point{{V: 3, T: 7985000}, {V: 3, T: 7990000}, {V: 3, T: 7995000}, {V: 3, T: 8000000}},
Floats: []FPoint{{F: 3, T: 7985000}, {F: 3, T: 7990000}, {F: 3, T: 7995000}, {F: 3, T: 8000000}},
Metric: labels.FromStrings("job", "api-server", "instance", "0", "group", "canary"),
},
Series{
Points: []Point{{V: 4, T: 7985000}, {V: 4, T: 7990000}, {V: 4, T: 7995000}, {V: 4, T: 8000000}},
Floats: []FPoint{{F: 4, T: 7985000}, {F: 4, T: 7990000}, {F: 4, T: 7995000}, {F: 4, T: 8000000}},
Metric: labels.FromStrings("job", "api-server", "instance", "1", "group", "canary"),
},
Series{
Points: []Point{{V: 1, T: 7985000}, {V: 1, T: 7990000}, {V: 1, T: 7995000}, {V: 1, T: 8000000}},
Floats: []FPoint{{F: 1, T: 7985000}, {F: 1, T: 7990000}, {F: 1, T: 7995000}, {F: 1, T: 8000000}},
Metric: labels.FromStrings("job", "api-server", "instance", "0", "group", "production"),
},
Series{
Points: []Point{{V: 2, T: 7985000}, {V: 2, T: 7990000}, {V: 2, T: 7995000}, {V: 2, T: 8000000}},
Floats: []FPoint{{F: 2, T: 7985000}, {F: 2, T: 7990000}, {F: 2, T: 7995000}, {F: 2, T: 8000000}},
Metric: labels.FromStrings("job", "api-server", "instance", "1", "group", "production"),
},
},
@ -1912,7 +1913,7 @@ func TestSubquerySelector(t *testing.T) {
nil,
Matrix{
Series{
Points: []Point{{V: 270, T: 90000}, {V: 300, T: 100000}, {V: 330, T: 110000}, {V: 360, T: 120000}},
Floats: []FPoint{{F: 270, T: 90000}, {F: 300, T: 100000}, {F: 330, T: 110000}, {F: 360, T: 120000}},
Metric: labels.EmptyLabels(),
},
},
@ -1926,7 +1927,7 @@ func TestSubquerySelector(t *testing.T) {
nil,
Matrix{
Series{
Points: []Point{{V: 800, T: 80000}, {V: 900, T: 90000}, {V: 1000, T: 100000}, {V: 1100, T: 110000}, {V: 1200, T: 120000}},
Floats: []FPoint{{F: 800, T: 80000}, {F: 900, T: 90000}, {F: 1000, T: 100000}, {F: 1100, T: 110000}, {F: 1200, T: 120000}},
Metric: labels.EmptyLabels(),
},
},
@ -1940,7 +1941,7 @@ func TestSubquerySelector(t *testing.T) {
nil,
Matrix{
Series{
Points: []Point{{V: 1000, T: 100000}, {V: 1000, T: 105000}, {V: 1100, T: 110000}, {V: 1100, T: 115000}, {V: 1200, T: 120000}},
Floats: []FPoint{{F: 1000, T: 100000}, {F: 1000, T: 105000}, {F: 1100, T: 110000}, {F: 1100, T: 115000}, {F: 1200, T: 120000}},
Metric: labels.EmptyLabels(),
},
},
@ -2996,7 +2997,7 @@ func TestRangeQuery(t *testing.T) {
Query: "sum_over_time(bar[30s])",
Result: Matrix{
Series{
Points: []Point{{V: 0, T: 0}, {V: 11, T: 60000}, {V: 1100, T: 120000}},
Floats: []FPoint{{F: 0, T: 0}, {F: 11, T: 60000}, {F: 1100, T: 120000}},
Metric: labels.EmptyLabels(),
},
},
@ -3011,7 +3012,7 @@ func TestRangeQuery(t *testing.T) {
Query: "sum_over_time(bar[30s])",
Result: Matrix{
Series{
Points: []Point{{V: 0, T: 0}, {V: 11, T: 60000}, {V: 1100, T: 120000}},
Floats: []FPoint{{F: 0, T: 0}, {F: 11, T: 60000}, {F: 1100, T: 120000}},
Metric: labels.EmptyLabels(),
},
},
@ -3026,7 +3027,7 @@ func TestRangeQuery(t *testing.T) {
Query: "sum_over_time(bar[30s])",
Result: Matrix{
Series{
Points: []Point{{V: 0, T: 0}, {V: 11, T: 60000}, {V: 1100, T: 120000}, {V: 110000, T: 180000}, {V: 11000000, T: 240000}},
Floats: []FPoint{{F: 0, T: 0}, {F: 11, T: 60000}, {F: 1100, T: 120000}, {F: 110000, T: 180000}, {F: 11000000, T: 240000}},
Metric: labels.EmptyLabels(),
},
},
@ -3041,7 +3042,7 @@ func TestRangeQuery(t *testing.T) {
Query: "sum_over_time(bar[30s])",
Result: Matrix{
Series{
Points: []Point{{V: 5, T: 0}, {V: 59, T: 60000}, {V: 9, T: 120000}, {V: 956, T: 180000}},
Floats: []FPoint{{F: 5, T: 0}, {F: 59, T: 60000}, {F: 9, T: 120000}, {F: 956, T: 180000}},
Metric: labels.EmptyLabels(),
},
},
@ -3056,7 +3057,7 @@ func TestRangeQuery(t *testing.T) {
Query: "metric",
Result: Matrix{
Series{
Points: []Point{{V: 1, T: 0}, {V: 3, T: 60000}, {V: 5, T: 120000}},
Floats: []FPoint{{F: 1, T: 0}, {F: 3, T: 60000}, {F: 5, T: 120000}},
Metric: labels.FromStrings("__name__", "metric"),
},
},
@ -3071,7 +3072,7 @@ func TestRangeQuery(t *testing.T) {
Query: "metric",
Result: Matrix{
Series{
Points: []Point{{V: 1, T: 0}, {V: 3, T: 60000}, {V: 5, T: 120000}},
Floats: []FPoint{{F: 1, T: 0}, {F: 3, T: 60000}, {F: 5, T: 120000}},
Metric: labels.FromStrings("__name__", "metric"),
},
},
@ -3087,14 +3088,14 @@ func TestRangeQuery(t *testing.T) {
Query: `foo > 2 or bar`,
Result: Matrix{
Series{
Points: []Point{{V: 1, T: 0}, {V: 3, T: 60000}, {V: 5, T: 120000}},
Floats: []FPoint{{F: 1, T: 0}, {F: 3, T: 60000}, {F: 5, T: 120000}},
Metric: labels.FromStrings(
"__name__", "bar",
"job", "2",
),
},
Series{
Points: []Point{{V: 3, T: 60000}, {V: 5, T: 120000}},
Floats: []FPoint{{F: 3, T: 60000}, {F: 5, T: 120000}},
Metric: labels.FromStrings(
"__name__", "foo",
"job", "1",
@ -3266,9 +3267,9 @@ func TestNativeHistogram_HistogramCountAndSum(t *testing.T) {
require.Len(t, vector, 1)
require.Nil(t, vector[0].H)
if floatHisto {
require.Equal(t, float64(h.ToFloat().Count), vector[0].V)
require.Equal(t, float64(h.ToFloat().Count), vector[0].F)
} else {
require.Equal(t, float64(h.Count), vector[0].V)
require.Equal(t, float64(h.Count), vector[0].F)
}
queryString = fmt.Sprintf("histogram_sum(%s)", seriesName)
@ -3284,9 +3285,9 @@ func TestNativeHistogram_HistogramCountAndSum(t *testing.T) {
require.Len(t, vector, 1)
require.Nil(t, vector[0].H)
if floatHisto {
require.Equal(t, h.ToFloat().Sum, vector[0].V)
require.Equal(t, h.ToFloat().Sum, vector[0].F)
} else {
require.Equal(t, h.Sum, vector[0].V)
require.Equal(t, h.Sum, vector[0].F)
}
})
}
@ -3519,7 +3520,7 @@ func TestNativeHistogram_HistogramQuantile(t *testing.T) {
require.Len(t, vector, 1)
require.Nil(t, vector[0].H)
require.True(t, almostEqual(sc.value, vector[0].V))
require.True(t, almostEqual(sc.value, vector[0].F))
})
}
idx++
@ -3951,10 +3952,10 @@ func TestNativeHistogram_HistogramFraction(t *testing.T) {
require.Len(t, vector, 1)
require.Nil(t, vector[0].H)
if math.IsNaN(sc.value) {
require.True(t, math.IsNaN(vector[0].V))
require.True(t, math.IsNaN(vector[0].F))
return
}
require.Equal(t, sc.value, vector[0].V)
require.Equal(t, sc.value, vector[0].F)
})
}
idx++
@ -4090,24 +4091,18 @@ func TestNativeHistogram_Sum_Count_AddOperator(t *testing.T) {
// sum().
queryString := fmt.Sprintf("sum(%s)", seriesName)
queryAndCheck(queryString, []Sample{
{Point{T: ts, H: &c.expected}, labels.EmptyLabels()},
})
queryAndCheck(queryString, []Sample{{T: ts, H: &c.expected, Metric: labels.EmptyLabels()}})
// + operator.
queryString = fmt.Sprintf(`%s{idx="0"}`, seriesName)
for idx := 1; idx < len(c.histograms); idx++ {
queryString += fmt.Sprintf(` + ignoring(idx) %s{idx="%d"}`, seriesName, idx)
}
queryAndCheck(queryString, []Sample{
{Point{T: ts, H: &c.expected}, labels.EmptyLabels()},
})
queryAndCheck(queryString, []Sample{{T: ts, H: &c.expected, Metric: labels.EmptyLabels()}})
// count().
queryString = fmt.Sprintf("count(%s)", seriesName)
queryAndCheck(queryString, []Sample{
{Point{T: ts, V: 3}, labels.EmptyLabels()},
})
queryAndCheck(queryString, []Sample{{T: ts, F: 3, Metric: labels.EmptyLabels()}})
})
idx0++
}

promql/functions.go (473)

@ -54,9 +54,9 @@ type FunctionCall func(vals []parser.Value, args parser.Expressions, enh *EvalNo
// === time() float64 ===
func funcTime(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper) Vector {
return Vector{Sample{Point: Point{
V: float64(enh.Ts) / 1000,
}}}
return Vector{Sample{
F: float64(enh.Ts) / 1000,
}}
}
// extrapolatedRate is a utility function for rate/increase/delta.
@ -67,65 +67,71 @@ func extrapolatedRate(vals []parser.Value, args parser.Expressions, enh *EvalNod
ms := args[0].(*parser.MatrixSelector)
vs := ms.VectorSelector.(*parser.VectorSelector)
var (
samples = vals[0].(Matrix)[0]
rangeStart = enh.Ts - durationMilliseconds(ms.Range+vs.Offset)
rangeEnd = enh.Ts - durationMilliseconds(vs.Offset)
resultValue float64
resultHistogram *histogram.FloatHistogram
samples = vals[0].(Matrix)[0]
rangeStart = enh.Ts - durationMilliseconds(ms.Range+vs.Offset)
rangeEnd = enh.Ts - durationMilliseconds(vs.Offset)
resultValue float64
resultHistogram *histogram.FloatHistogram
firstT, lastT int64
numSamplesMinusOne int
)
// No sense in trying to compute a rate without at least two points. Drop
// this Vector element.
if len(samples.Points) < 2 {
// We need either at least two Histograms and no Floats, or at least two
// Floats and no Histograms to calculate a rate. Otherwise, drop this
// Vector element.
if len(samples.Histograms) > 0 && len(samples.Floats) > 0 {
// Mix of histograms and floats. TODO(beorn7): Communicate this failure reason.
return enh.Out
}
if samples.Points[0].H != nil {
resultHistogram = histogramRate(samples.Points, isCounter)
switch {
case len(samples.Histograms) > 1:
numSamplesMinusOne = len(samples.Histograms) - 1
firstT = samples.Histograms[0].T
lastT = samples.Histograms[numSamplesMinusOne].T
resultHistogram = histogramRate(samples.Histograms, isCounter)
if resultHistogram == nil {
// Points are a mix of floats and histograms, or the histograms
// are not compatible with each other.
// TODO(beorn7): find a way of communicating the exact reason
// The histograms are not compatible with each other.
// TODO(beorn7): Communicate this failure reason.
return enh.Out
}
} else {
resultValue = samples.Points[len(samples.Points)-1].V - samples.Points[0].V
prevValue := samples.Points[0].V
// We have to iterate through everything even in the non-counter
// case because we have to check that everything is a float.
// TODO(beorn7): Find a way to check that earlier, e.g. by
// handing in a []FloatPoint and a []HistogramPoint separately.
for _, currPoint := range samples.Points[1:] {
if currPoint.H != nil {
return nil // Range contains a mix of histograms and floats.
}
if !isCounter {
continue
}
if currPoint.V < prevValue {
case len(samples.Floats) > 1:
numSamplesMinusOne = len(samples.Floats) - 1
firstT = samples.Floats[0].T
lastT = samples.Floats[numSamplesMinusOne].T
resultValue = samples.Floats[numSamplesMinusOne].F - samples.Floats[0].F
if !isCounter {
break
}
// Handle counter resets:
prevValue := samples.Floats[0].F
for _, currPoint := range samples.Floats[1:] {
if currPoint.F < prevValue {
resultValue += prevValue
}
prevValue = currPoint.V
prevValue = currPoint.F
}
default:
// Not enough samples. TODO(beorn7): Communicate this failure reason.
return enh.Out
}
// Duration between first/last samples and boundary of range.
durationToStart := float64(samples.Points[0].T-rangeStart) / 1000
durationToEnd := float64(rangeEnd-samples.Points[len(samples.Points)-1].T) / 1000
durationToStart := float64(firstT-rangeStart) / 1000
durationToEnd := float64(rangeEnd-lastT) / 1000
sampledInterval := float64(samples.Points[len(samples.Points)-1].T-samples.Points[0].T) / 1000
averageDurationBetweenSamples := sampledInterval / float64(len(samples.Points)-1)
sampledInterval := float64(lastT-firstT) / 1000
averageDurationBetweenSamples := sampledInterval / float64(numSamplesMinusOne)
// TODO(beorn7): Do this for histograms, too.
if isCounter && resultValue > 0 && samples.Points[0].V >= 0 {
// Counters cannot be negative. If we have any slope at
// all (i.e. resultValue went up), we can extrapolate
// the zero point of the counter. If the duration to the
// zero point is shorter than the durationToStart, we
// take the zero point as the start of the series,
// thereby avoiding extrapolation to negative counter
// values.
durationToZero := sampledInterval * (samples.Points[0].V / resultValue)
if isCounter && resultValue > 0 && len(samples.Floats) > 0 && samples.Floats[0].F >= 0 {
// Counters cannot be negative. If we have any slope at all
// (i.e. resultValue went up), we can extrapolate the zero point
// of the counter. If the duration to the zero point is shorter
// than the durationToStart, we take the zero point as the start
// of the series, thereby avoiding extrapolation to negative
// counter values.
durationToZero := sampledInterval * (samples.Floats[0].F / resultValue)
if durationToZero < durationToStart {
durationToStart = durationToZero
}
@ -158,16 +164,14 @@ func extrapolatedRate(vals []parser.Value, args parser.Expressions, enh *EvalNod
resultHistogram.Scale(factor)
}
return append(enh.Out, Sample{
Point: Point{V: resultValue, H: resultHistogram},
})
return append(enh.Out, Sample{F: resultValue, H: resultHistogram})
}
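The float branch above folds counter resets back into the result by re-adding the pre-reset value whenever a sample drops. A minimal, self-contained sketch of that accumulation, using a stand-in `FPoint` rather than the package's own type:

```
package main

import "fmt"

// FPoint mirrors the shape of promql's float point: a timestamp and a value.
type FPoint struct {
	T int64
	F float64
}

// increase computes last-first, re-adding the pre-reset value whenever
// the counter drops, as extrapolatedRate does for counters.
func increase(points []FPoint) float64 {
	if len(points) < 2 {
		return 0 // Not enough samples to compute a rate.
	}
	result := points[len(points)-1].F - points[0].F
	prev := points[0].F
	for _, p := range points[1:] {
		if p.F < prev { // Counter reset: re-add what was lost.
			result += prev
		}
		prev = p.F
	}
	return result
}

func main() {
	// 10 -> 20 -> reset to 0 -> 5: total increase is 10 + 5 = 15.
	pts := []FPoint{{0, 10}, {10, 20}, {20, 0}, {30, 5}}
	fmt.Println(increase(pts)) // 15
}
```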
// histogramRate is a helper function for extrapolatedRate. It requires
// points[0] to be a histogram. It returns nil if any other Point in points is
// not a histogram.
func histogramRate(points []Point, isCounter bool) *histogram.FloatHistogram {
prev := points[0].H // We already know that this is a histogram.
func histogramRate(points []HPoint, isCounter bool) *histogram.FloatHistogram {
prev := points[0].H
last := points[len(points)-1].H
if last == nil {
return nil // Range contains a mix of histograms and floats.
@ -243,19 +247,19 @@ func instantValue(vals []parser.Value, out Vector, isRate bool) Vector {
samples := vals[0].(Matrix)[0]
// No sense in trying to compute a rate without at least two points. Drop
// this Vector element.
if len(samples.Points) < 2 {
if len(samples.Floats) < 2 {
return out
}
lastSample := samples.Points[len(samples.Points)-1]
previousSample := samples.Points[len(samples.Points)-2]
lastSample := samples.Floats[len(samples.Floats)-1]
previousSample := samples.Floats[len(samples.Floats)-2]
var resultValue float64
if isRate && lastSample.V < previousSample.V {
if isRate && lastSample.F < previousSample.F {
// Counter reset.
resultValue = lastSample.V
resultValue = lastSample.F
} else {
resultValue = lastSample.V - previousSample.V
resultValue = lastSample.F - previousSample.F
}
sampledInterval := lastSample.T - previousSample.T
@ -269,9 +273,7 @@ func instantValue(vals []parser.Value, out Vector, isRate bool) Vector {
resultValue /= float64(sampledInterval) / 1000
}
return append(out, Sample{
Point: Point{V: resultValue},
})
return append(out, Sample{F: resultValue})
}
// Calculate the trend value at the given index i in raw data d.
@ -300,10 +302,10 @@ func funcHoltWinters(vals []parser.Value, args parser.Expressions, enh *EvalNode
samples := vals[0].(Matrix)[0]
// The smoothing factor argument.
sf := vals[1].(Vector)[0].V
sf := vals[1].(Vector)[0].F
// The trend factor argument.
tf := vals[2].(Vector)[0].V
tf := vals[2].(Vector)[0].F
// Check that the input parameters are valid.
if sf <= 0 || sf >= 1 {
@ -313,7 +315,7 @@ func funcHoltWinters(vals []parser.Value, args parser.Expressions, enh *EvalNode
panic(fmt.Errorf("invalid trend factor. Expected: 0 < tf < 1, got: %f", tf))
}
l := len(samples.Points)
l := len(samples.Floats)
// Can't do the smoothing operation with less than two points.
if l < 2 {
@ -322,15 +324,15 @@ func funcHoltWinters(vals []parser.Value, args parser.Expressions, enh *EvalNode
var s0, s1, b float64
// Set initial values.
s1 = samples.Points[0].V
b = samples.Points[1].V - samples.Points[0].V
s1 = samples.Floats[0].F
b = samples.Floats[1].F - samples.Floats[0].F
// Run the smoothing operation.
var x, y float64
for i := 1; i < l; i++ {
// Scale the raw value against the smoothing factor.
x = sf * samples.Points[i].V
x = sf * samples.Floats[i].F
// Scale the last smoothed value with the trend at this point.
b = calcTrendValue(i-1, tf, s0, s1, b)
@ -339,9 +341,7 @@ func funcHoltWinters(vals []parser.Value, args parser.Expressions, enh *EvalNode
s0, s1 = s1, x+y
}
return append(enh.Out, Sample{
Point: Point{V: s1},
})
return append(enh.Out, Sample{F: s1})
}
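The smoothing recurrence itself is untouched by this commit; only its input moved from `samples.Points` to `samples.Floats`. For reference, a rough standalone version of the double exponential smoothing loop (names are illustrative, not the package's own):

```
package main

import "fmt"

// calcTrend blends the last smoothed delta with the previous trend,
// matching the shape of calcTrendValue in promql/functions.go.
func calcTrend(i int, tf, s0, s1, b float64) float64 {
	if i == 0 {
		return b
	}
	x := tf * (s1 - s0)
	y := (1 - tf) * b
	return x + y
}

// holtWinters runs double exponential smoothing over raw float values,
// mirroring the loop in funcHoltWinters. Needs at least two values.
func holtWinters(vals []float64, sf, tf float64) float64 {
	s1 := vals[0]
	b := vals[1] - vals[0]
	var s0, x, y float64
	for i := 1; i < len(vals); i++ {
		x = sf * vals[i]                  // Scale the raw value.
		b = calcTrend(i-1, tf, s0, s1, b) // Update the trend.
		y = (1 - sf) * (s1 + b)           // Scale the smoothed value.
		s0, s1 = s1, x+y
	}
	return s1
}

func main() {
	fmt.Println(holtWinters([]float64{1, 2, 3, 4, 5}, 0.5, 0.5))
}
```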
// === sort(node parser.ValueTypeVector) Vector ===
@ -365,15 +365,15 @@ func funcSortDesc(vals []parser.Value, args parser.Expressions, enh *EvalNodeHel
// === clamp(Vector parser.ValueTypeVector, min, max Scalar) Vector ===
func funcClamp(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper) Vector {
vec := vals[0].(Vector)
min := vals[1].(Vector)[0].Point.V
max := vals[2].(Vector)[0].Point.V
min := vals[1].(Vector)[0].F
max := vals[2].(Vector)[0].F
if max < min {
return enh.Out
}
for _, el := range vec {
enh.Out = append(enh.Out, Sample{
Metric: enh.DropMetricName(el.Metric),
Point: Point{V: math.Max(min, math.Min(max, el.V))},
F: math.Max(min, math.Min(max, el.F)),
})
}
return enh.Out
@ -382,11 +382,11 @@ func funcClamp(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper
// === clamp_max(Vector parser.ValueTypeVector, max Scalar) Vector ===
func funcClampMax(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper) Vector {
vec := vals[0].(Vector)
max := vals[1].(Vector)[0].Point.V
max := vals[1].(Vector)[0].F
for _, el := range vec {
enh.Out = append(enh.Out, Sample{
Metric: enh.DropMetricName(el.Metric),
Point: Point{V: math.Min(max, el.V)},
F: math.Min(max, el.F),
})
}
return enh.Out
@ -395,11 +395,11 @@ func funcClampMax(vals []parser.Value, args parser.Expressions, enh *EvalNodeHel
// === clamp_min(Vector parser.ValueTypeVector, min Scalar) Vector ===
func funcClampMin(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper) Vector {
vec := vals[0].(Vector)
min := vals[1].(Vector)[0].Point.V
min := vals[1].(Vector)[0].F
for _, el := range vec {
enh.Out = append(enh.Out, Sample{
Metric: enh.DropMetricName(el.Metric),
Point: Point{V: math.Max(min, el.V)},
F: math.Max(min, el.F),
})
}
return enh.Out
@ -412,16 +412,16 @@ func funcRound(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper
// Ties are solved by rounding up.
toNearest := float64(1)
if len(args) >= 2 {
toNearest = vals[1].(Vector)[0].Point.V
toNearest = vals[1].(Vector)[0].F
}
// Invert as it seems to cause fewer floating point accuracy issues.
toNearestInverse := 1.0 / toNearest
for _, el := range vec {
v := math.Floor(el.V*toNearestInverse+0.5) / toNearestInverse
v := math.Floor(el.F*toNearestInverse+0.5) / toNearestInverse
enh.Out = append(enh.Out, Sample{
Metric: enh.DropMetricName(el.Metric),
Point: Point{V: v},
F: v,
})
}
return enh.Out
@ -431,37 +431,38 @@ func funcRound(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper
func funcScalar(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper) Vector {
v := vals[0].(Vector)
if len(v) != 1 {
return append(enh.Out, Sample{
Point: Point{V: math.NaN()},
})
return append(enh.Out, Sample{F: math.NaN()})
}
return append(enh.Out, Sample{
Point: Point{V: v[0].V},
})
return append(enh.Out, Sample{F: v[0].F})
}
func aggrOverTime(vals []parser.Value, enh *EvalNodeHelper, aggrFn func([]Point) float64) Vector {
func aggrOverTime(vals []parser.Value, enh *EvalNodeHelper, aggrFn func(Series) float64) Vector {
el := vals[0].(Matrix)[0]
return append(enh.Out, Sample{
Point: Point{V: aggrFn(el.Points)},
})
return append(enh.Out, Sample{F: aggrFn(el)})
}
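With the callback now receiving the whole `Series`, each aggregation chooses which point slices it consumes: `count_over_time` counts both kinds, while the float-only aggregations below return early when the range holds nothing but histograms. A toy version of the pattern, with stand-in types:

```
package main

import "fmt"

type FPoint struct {
	T int64
	F float64
}

type HPoint struct {
	T int64
	// In promql this also carries a *histogram.FloatHistogram.
}

// Series holds float and histogram points in separate slices.
type Series struct {
	Floats     []FPoint
	Histograms []HPoint
}

// aggrOverTime applies an aggregation function to a whole series.
func aggrOverTime(s Series, aggrFn func(Series) float64) float64 {
	return aggrFn(s)
}

func main() {
	s := Series{
		Floats:     []FPoint{{0, 1}, {10, 2}},
		Histograms: []HPoint{{20}},
	}
	// count_over_time counts floats and histograms alike.
	count := aggrOverTime(s, func(s Series) float64 {
		return float64(len(s.Floats) + len(s.Histograms))
	})
	fmt.Println(count) // 3
}
```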
// === avg_over_time(Matrix parser.ValueTypeMatrix) Vector ===
func funcAvgOverTime(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper) Vector {
return aggrOverTime(vals, enh, func(values []Point) float64 {
if len(vals[0].(Matrix)[0].Floats) == 0 {
// TODO(beorn7): The passed values only contain
// histograms. avg_over_time ignores histograms for now. If
// there are only histograms, we have to return without adding
// anything to enh.Out.
return enh.Out
}
return aggrOverTime(vals, enh, func(s Series) float64 {
var mean, count, c float64
for _, v := range values {
for _, f := range s.Floats {
count++
if math.IsInf(mean, 0) {
if math.IsInf(v.V, 0) && (mean > 0) == (v.V > 0) {
// The `mean` and `v.V` values are `Inf` of the same sign. They
if math.IsInf(f.F, 0) && (mean > 0) == (f.F > 0) {
// The `mean` and `f.F` values are `Inf` of the same sign. They
// can't be subtracted, but the value of `mean` is correct
// already.
continue
}
if !math.IsInf(v.V, 0) && !math.IsNaN(v.V) {
if !math.IsInf(f.F, 0) && !math.IsNaN(f.F) {
// At this stage, the mean is infinite. If the added
// value is neither an Inf nor a NaN, we can keep that mean
// value.
@ -471,7 +472,7 @@ func funcAvgOverTime(vals []parser.Value, args parser.Expressions, enh *EvalNode
continue
}
}
mean, c = kahanSumInc(v.V/count-mean/count, mean, c)
mean, c = kahanSumInc(f.F/count-mean/count, mean, c)
}
if math.IsInf(mean, 0) {
@ -483,8 +484,8 @@ func funcAvgOverTime(vals []parser.Value, args parser.Expressions, enh *EvalNode
// === count_over_time(Matrix parser.ValueTypeMatrix) Vector ===
func funcCountOverTime(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper) Vector {
return aggrOverTime(vals, enh, func(values []Point) float64 {
return float64(len(values))
return aggrOverTime(vals, enh, func(s Series) float64 {
return float64(len(s.Floats) + len(s.Histograms))
})
}
@ -492,19 +493,42 @@ func funcCountOverTime(vals []parser.Value, args parser.Expressions, enh *EvalNo
func funcLastOverTime(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper) Vector {
el := vals[0].(Matrix)[0]
var f FPoint
if len(el.Floats) > 0 {
f = el.Floats[len(el.Floats)-1]
}
var h HPoint
if len(el.Histograms) > 0 {
h = el.Histograms[len(el.Histograms)-1]
}
if h.H == nil || h.T < f.T {
return append(enh.Out, Sample{
Metric: el.Metric,
F: f.F,
})
}
return append(enh.Out, Sample{
Metric: el.Metric,
Point: Point{V: el.Points[len(el.Points)-1].V},
H: h.H,
})
}
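Picking the most recent sample now means comparing the last float against the last histogram by timestamp; the float wins when there is no histogram at all or when the histogram is older. A condensed sketch of that decision (the histogram pointer is a stand-in):

```
package main

import "fmt"

type FPoint struct {
	T int64
	F float64
}

type HPoint struct {
	T int64
	H *struct{} // Stand-in for *histogram.FloatHistogram.
}

// lastIsFloat reports whether the latest sample is the float one,
// mirroring funcLastOverTime's h.H == nil || h.T < f.T check.
func lastIsFloat(floats []FPoint, histograms []HPoint) bool {
	var f FPoint
	if len(floats) > 0 {
		f = floats[len(floats)-1]
	}
	var h HPoint
	if len(histograms) > 0 {
		h = histograms[len(histograms)-1]
	}
	return h.H == nil || h.T < f.T
}

func main() {
	floats := []FPoint{{T: 10, F: 1}}
	histograms := []HPoint{{T: 20, H: &struct{}{}}}
	fmt.Println(lastIsFloat(floats, histograms)) // false: the histogram is newer.
}
```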
// === max_over_time(Matrix parser.ValueTypeMatrix) Vector ===
func funcMaxOverTime(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper) Vector {
return aggrOverTime(vals, enh, func(values []Point) float64 {
max := values[0].V
for _, v := range values {
if v.V > max || math.IsNaN(max) {
max = v.V
if len(vals[0].(Matrix)[0].Floats) == 0 {
// TODO(beorn7): The passed values only contain
// histograms. max_over_time ignores histograms for now. If
// there are only histograms, we have to return without adding
// anything to enh.Out.
return enh.Out
}
return aggrOverTime(vals, enh, func(s Series) float64 {
max := s.Floats[0].F
for _, f := range s.Floats {
if f.F > max || math.IsNaN(max) {
max = f.F
}
}
return max
@ -513,11 +537,18 @@ func funcMaxOverTime(vals []parser.Value, args parser.Expressions, enh *EvalNode
// === min_over_time(Matrix parser.ValueTypeMatrix) Vector ===
func funcMinOverTime(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper) Vector {
return aggrOverTime(vals, enh, func(values []Point) float64 {
min := values[0].V
for _, v := range values {
if v.V < min || math.IsNaN(min) {
min = v.V
if len(vals[0].(Matrix)[0].Floats) == 0 {
// TODO(beorn7): The passed values only contain
// histograms. min_over_time ignores histograms for now. If
// there are only histograms, we have to return without adding
// anything to enh.Out.
return enh.Out
}
return aggrOverTime(vals, enh, func(s Series) float64 {
min := s.Floats[0].F
for _, f := range s.Floats {
if f.F < min || math.IsNaN(min) {
min = f.F
}
}
return min
@ -526,10 +557,17 @@ func funcMinOverTime(vals []parser.Value, args parser.Expressions, enh *EvalNode
// === sum_over_time(Matrix parser.ValueTypeMatrix) Vector ===
func funcSumOverTime(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper) Vector {
return aggrOverTime(vals, enh, func(values []Point) float64 {
if len(vals[0].(Matrix)[0].Floats) == 0 {
// TODO(beorn7): The passed values only contain
// histograms. sum_over_time ignores histograms for now. If
// there are only histograms, we have to return without adding
// anything to enh.Out.
return enh.Out
}
return aggrOverTime(vals, enh, func(s Series) float64 {
var sum, c float64
for _, v := range values {
sum, c = kahanSumInc(v.V, sum, c)
for _, f := range s.Floats {
sum, c = kahanSumInc(f.F, sum, c)
}
if math.IsInf(sum, 0) {
return sum
@ -540,29 +578,41 @@ func funcSumOverTime(vals []parser.Value, args parser.Expressions, enh *EvalNode
// === quantile_over_time(Matrix parser.ValueTypeMatrix) Vector ===
func funcQuantileOverTime(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper) Vector {
q := vals[0].(Vector)[0].V
q := vals[0].(Vector)[0].F
el := vals[1].(Matrix)[0]
if len(el.Floats) == 0 {
// TODO(beorn7): The passed values only contain
// histograms. quantile_over_time ignores histograms for now. If
// there are only histograms, we have to return without adding
// anything to enh.Out.
return enh.Out
}
values := make(vectorByValueHeap, 0, len(el.Points))
for _, v := range el.Points {
values = append(values, Sample{Point: Point{V: v.V}})
values := make(vectorByValueHeap, 0, len(el.Floats))
for _, f := range el.Floats {
values = append(values, Sample{F: f.F})
}
return append(enh.Out, Sample{
Point: Point{V: quantile(q, values)},
})
return append(enh.Out, Sample{F: quantile(q, values)})
}
// === stddev_over_time(Matrix parser.ValueTypeMatrix) Vector ===
func funcStddevOverTime(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper) Vector {
return aggrOverTime(vals, enh, func(values []Point) float64 {
if len(vals[0].(Matrix)[0].Floats) == 0 {
// TODO(beorn7): The passed values only contain
// histograms. stddev_over_time ignores histograms for now. If
// there are only histograms, we have to return without adding
// anything to enh.Out.
return enh.Out
}
return aggrOverTime(vals, enh, func(s Series) float64 {
var count float64
var mean, cMean float64
var aux, cAux float64
for _, v := range values {
for _, f := range s.Floats {
count++
delta := v.V - (mean + cMean)
delta := f.F - (mean + cMean)
mean, cMean = kahanSumInc(delta/count, mean, cMean)
aux, cAux = kahanSumInc(delta*(v.V-(mean+cMean)), aux, cAux)
aux, cAux = kahanSumInc(delta*(f.F-(mean+cMean)), aux, cAux)
}
return math.Sqrt((aux + cAux) / count)
})
@ -570,15 +620,22 @@ func funcStddevOverTime(vals []parser.Value, args parser.Expressions, enh *EvalN
// === stdvar_over_time(Matrix parser.ValueTypeMatrix) Vector ===
func funcStdvarOverTime(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper) Vector {
return aggrOverTime(vals, enh, func(values []Point) float64 {
if len(vals[0].(Matrix)[0].Floats) == 0 {
// TODO(beorn7): The passed values only contain
// histograms. stdvar_over_time ignores histograms for now. If
// there are only histograms, we have to return without adding
// anything to enh.Out.
return enh.Out
}
return aggrOverTime(vals, enh, func(s Series) float64 {
var count float64
var mean, cMean float64
var aux, cAux float64
for _, v := range values {
for _, f := range s.Floats {
count++
delta := v.V - (mean + cMean)
delta := f.F - (mean + cMean)
mean, cMean = kahanSumInc(delta/count, mean, cMean)
aux, cAux = kahanSumInc(delta*(v.V-(mean+cMean)), aux, cAux)
aux, cAux = kahanSumInc(delta*(f.F-(mean+cMean)), aux, cAux)
}
return (aux + cAux) / count
})
@ -592,7 +649,7 @@ func funcAbsent(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelpe
return append(enh.Out,
Sample{
Metric: createLabelsForAbsentFunction(args[0]),
Point: Point{V: 1},
F: 1,
})
}
@ -602,25 +659,24 @@ func funcAbsent(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelpe
// Due to engine optimization, this function is only called when this condition is true.
// Then, the engine post-processes the results to get the expected output.
func funcAbsentOverTime(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper) Vector {
return append(enh.Out,
Sample{
Point: Point{V: 1},
})
return append(enh.Out, Sample{F: 1})
}
// === present_over_time(Vector parser.ValueTypeMatrix) Vector ===
func funcPresentOverTime(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper) Vector {
return aggrOverTime(vals, enh, func(values []Point) float64 {
return aggrOverTime(vals, enh, func(s Series) float64 {
return 1
})
}
func simpleFunc(vals []parser.Value, enh *EvalNodeHelper, f func(float64) float64) Vector {
for _, el := range vals[0].(Vector) {
enh.Out = append(enh.Out, Sample{
Metric: enh.DropMetricName(el.Metric),
Point: Point{V: f(el.V)},
})
if el.H == nil { // Process only float samples.
enh.Out = append(enh.Out, Sample{
Metric: enh.DropMetricName(el.Metric),
F: f(el.F),
})
}
}
return enh.Out
}
@ -741,9 +797,7 @@ func funcDeg(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper)
// === pi() Scalar ===
func funcPi(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper) Vector {
return Vector{Sample{Point: Point{
V: math.Pi,
}}}
return Vector{Sample{F: math.Pi}}
}
// === sgn(Vector parser.ValueTypeVector) Vector ===
@ -764,7 +818,7 @@ func funcTimestamp(vals []parser.Value, args parser.Expressions, enh *EvalNodeHe
for _, el := range vec {
enh.Out = append(enh.Out, Sample{
Metric: enh.DropMetricName(el.Metric),
Point: Point{V: float64(el.T) / 1000},
F: float64(el.T) / 1000,
})
}
return enh.Out
@ -793,7 +847,7 @@ func kahanSumInc(inc, sum, c float64) (newSum, newC float64) {
// linearRegression performs a least-square linear regression analysis on the
// provided SamplePairs. It returns the slope, and the intercept value at the
// provided time.
func linearRegression(samples []Point, interceptTime int64) (slope, intercept float64) {
func linearRegression(samples []FPoint, interceptTime int64) (slope, intercept float64) {
var (
n float64
sumX, cX float64
@ -803,18 +857,18 @@ func linearRegression(samples []Point, interceptTime int64) (slope, intercept fl
initY float64
constY bool
)
initY = samples[0].V
initY = samples[0].F
constY = true
for i, sample := range samples {
// Set constY to false if any new y values are encountered.
if constY && i > 0 && sample.V != initY {
if constY && i > 0 && sample.F != initY {
constY = false
}
n += 1.0
x := float64(sample.T-interceptTime) / 1e3
sumX, cX = kahanSumInc(x, sumX, cX)
sumY, cY = kahanSumInc(sample.V, sumY, cY)
sumXY, cXY = kahanSumInc(x*sample.V, sumXY, cXY)
sumY, cY = kahanSumInc(sample.F, sumY, cY)
sumXY, cXY = kahanSumInc(x*sample.F, sumXY, cXY)
sumX2, cX2 = kahanSumInc(x*x, sumX2, cX2)
}
if constY {
@ -842,33 +896,29 @@ func funcDeriv(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper
// No sense in trying to compute a derivative without at least two points.
// Drop this Vector element.
if len(samples.Points) < 2 {
if len(samples.Floats) < 2 {
return enh.Out
}
// We pass in an arbitrary timestamp that is near the values in use
// to avoid floating point accuracy issues, see
// https://github.com/prometheus/prometheus/issues/2674
slope, _ := linearRegression(samples.Points, samples.Points[0].T)
return append(enh.Out, Sample{
Point: Point{V: slope},
})
slope, _ := linearRegression(samples.Floats, samples.Floats[0].T)
return append(enh.Out, Sample{F: slope})
}
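`linearRegression` feeds all of its running sums through `kahanSumInc` to keep floating-point error bounded over long ranges. A self-contained demonstration of that compensated summation, mirroring the helper in this file:

```
package main

import (
	"fmt"
	"math"
)

// kahanSumInc adds inc to sum while carrying a compensation term c
// (Neumaier's variant of Kahan summation, as in promql/functions.go).
func kahanSumInc(inc, sum, c float64) (newSum, newC float64) {
	t := sum + inc
	// Swap the roles if the next term is larger than the running sum.
	if math.Abs(sum) >= math.Abs(inc) {
		c += (sum - t) + inc
	} else {
		c += (inc - t) + sum
	}
	return t, c
}

func main() {
	// Naive summation loses the small terms; the compensated sum keeps them.
	var naive float64
	var sum, c float64
	for i := 0; i < 1000; i++ {
		naive += 1e16
		naive += 1
		naive -= 1e16
		sum, c = kahanSumInc(1e16, sum, c)
		sum, c = kahanSumInc(1, sum, c)
		sum, c = kahanSumInc(-1e16, sum, c)
	}
	fmt.Println(naive, sum+c) // 0 vs 1000: the compensated result is exact.
}
```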
// === predict_linear(node parser.ValueTypeMatrix, k parser.ValueTypeScalar) Vector ===
func funcPredictLinear(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper) Vector {
samples := vals[0].(Matrix)[0]
duration := vals[1].(Vector)[0].V
duration := vals[1].(Vector)[0].F
// No sense in trying to predict anything without at least two points.
// Drop this Vector element.
if len(samples.Points) < 2 {
if len(samples.Floats) < 2 {
return enh.Out
}
slope, intercept := linearRegression(samples.Points, enh.Ts)
slope, intercept := linearRegression(samples.Floats, enh.Ts)
return append(enh.Out, Sample{
Point: Point{V: slope*duration + intercept},
})
return append(enh.Out, Sample{F: slope*duration + intercept})
}
// === histogram_count(Vector parser.ValueTypeVector) Vector ===
@ -882,7 +932,7 @@ func funcHistogramCount(vals []parser.Value, args parser.Expressions, enh *EvalN
}
enh.Out = append(enh.Out, Sample{
Metric: enh.DropMetricName(sample.Metric),
Point: Point{V: sample.H.Count},
F: sample.H.Count,
})
}
return enh.Out
@ -899,7 +949,7 @@ func funcHistogramSum(vals []parser.Value, args parser.Expressions, enh *EvalNod
}
enh.Out = append(enh.Out, Sample{
Metric: enh.DropMetricName(sample.Metric),
Point: Point{V: sample.H.Sum},
F: sample.H.Sum,
})
}
return enh.Out
@ -907,8 +957,8 @@ func funcHistogramSum(vals []parser.Value, args parser.Expressions, enh *EvalNod
// === histogram_fraction(lower, upper parser.ValueTypeScalar, Vector parser.ValueTypeVector) Vector ===
func funcHistogramFraction(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper) Vector {
lower := vals[0].(Vector)[0].V
upper := vals[1].(Vector)[0].V
lower := vals[0].(Vector)[0].F
upper := vals[1].(Vector)[0].F
inVec := vals[2].(Vector)
for _, sample := range inVec {
@ -918,7 +968,7 @@ func funcHistogramFraction(vals []parser.Value, args parser.Expressions, enh *Ev
}
enh.Out = append(enh.Out, Sample{
Metric: enh.DropMetricName(sample.Metric),
Point: Point{V: histogramFraction(lower, upper, sample.H)},
F: histogramFraction(lower, upper, sample.H),
})
}
return enh.Out
@ -926,7 +976,7 @@ func funcHistogramFraction(vals []parser.Value, args parser.Expressions, enh *Ev
// === histogram_quantile(k parser.ValueTypeScalar, Vector parser.ValueTypeVector) Vector ===
func funcHistogramQuantile(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper) Vector {
q := vals[0].(Vector)[0].V
q := vals[0].(Vector)[0].F
inVec := vals[1].(Vector)
if enh.signatureToMetricWithBuckets == nil {
@ -965,7 +1015,7 @@ func funcHistogramQuantile(vals []parser.Value, args parser.Expressions, enh *Ev
mb = &metricWithBuckets{sample.Metric, nil}
enh.signatureToMetricWithBuckets[string(enh.lblBuf)] = mb
}
mb.buckets = append(mb.buckets, bucket{upperBound, sample.V})
mb.buckets = append(mb.buckets, bucket{upperBound, sample.F})
}
@ -985,7 +1035,7 @@ func funcHistogramQuantile(vals []parser.Value, args parser.Expressions, enh *Ev
enh.Out = append(enh.Out, Sample{
Metric: enh.DropMetricName(sample.Metric),
Point: Point{V: histogramQuantile(q, sample.H)},
F: histogramQuantile(q, sample.H),
})
}
@ -993,7 +1043,7 @@ func funcHistogramQuantile(vals []parser.Value, args parser.Expressions, enh *Ev
if len(mb.buckets) > 0 {
enh.Out = append(enh.Out, Sample{
Metric: mb.metric,
Point: Point{V: bucketQuantile(q, mb.buckets)},
F: bucketQuantile(q, mb.buckets),
})
}
}
@ -1003,40 +1053,55 @@ func funcHistogramQuantile(vals []parser.Value, args parser.Expressions, enh *Ev
// === resets(Matrix parser.ValueTypeMatrix) Vector ===
func funcResets(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper) Vector {
samples := vals[0].(Matrix)[0]
floats := vals[0].(Matrix)[0].Floats
histograms := vals[0].(Matrix)[0].Histograms
resets := 0
prev := samples.Points[0].V
for _, sample := range samples.Points[1:] {
current := sample.V
if current < prev {
resets++
if len(floats) > 1 {
prev := floats[0].F
for _, sample := range floats[1:] {
current := sample.F
if current < prev {
resets++
}
prev = current
}
prev = current
}
return append(enh.Out, Sample{
Point: Point{V: float64(resets)},
})
if len(histograms) > 1 {
prev := histograms[0].H
for _, sample := range histograms[1:] {
current := sample.H
if current.DetectReset(prev) {
resets++
}
prev = current
}
}
return append(enh.Out, Sample{F: float64(resets)})
}
// === changes(Matrix parser.ValueTypeMatrix) Vector ===
func funcChanges(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper) Vector {
samples := vals[0].(Matrix)[0]
floats := vals[0].(Matrix)[0].Floats
changes := 0
prev := samples.Points[0].V
for _, sample := range samples.Points[1:] {
current := sample.V
if len(floats) == 0 {
// TODO(beorn7): Only histogram values, still need to add support.
return enh.Out
}
prev := floats[0].F
for _, sample := range floats[1:] {
current := sample.F
if current != prev && !(math.IsNaN(current) && math.IsNaN(prev)) {
changes++
}
prev = current
}
return append(enh.Out, Sample{
Point: Point{V: float64(changes)},
})
return append(enh.Out, Sample{F: float64(changes)})
}
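The NaN guard matters because `NaN != NaN` in IEEE 754: without it, a run of NaN samples would register a change at every step. A small standalone check of the comparison used above:

```
package main

import (
	"fmt"
	"math"
)

// changes counts value transitions, treating consecutive NaNs as equal,
// like funcChanges does for the float samples of a range.
func changes(vals []float64) int {
	n := 0
	prev := vals[0]
	for _, cur := range vals[1:] {
		if cur != prev && !(math.IsNaN(cur) && math.IsNaN(prev)) {
			n++
		}
		prev = cur
	}
	return n
}

func main() {
	nan := math.NaN()
	// 1 -> NaN counts, NaN -> NaN does not, NaN -> 1 counts again.
	fmt.Println(changes([]float64{1, nan, nan, 1})) // 2
}
```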
// === label_replace(Vector parser.ValueTypeVector, dst_label, replacement, src_labelname, regex parser.ValueTypeString) Vector ===
@ -1087,7 +1152,8 @@ func funcLabelReplace(vals []parser.Value, args parser.Expressions, enh *EvalNod
enh.Out = append(enh.Out, Sample{
Metric: outMetric,
Point: Point{V: el.Point.V},
F: el.F,
H: el.H,
})
}
return enh.Out
@ -1098,7 +1164,7 @@ func funcVector(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelpe
return append(enh.Out,
Sample{
Metric: labels.Labels{},
Point: Point{V: vals[0].(Vector)[0].V},
F: vals[0].(Vector)[0].F,
})
}
@ -1154,7 +1220,8 @@ func funcLabelJoin(vals []parser.Value, args parser.Expressions, enh *EvalNodeHe
enh.Out = append(enh.Out, Sample{
Metric: outMetric,
Point: Point{V: el.Point.V},
F: el.F,
H: el.H,
})
}
return enh.Out
@ -1166,15 +1233,15 @@ func dateWrapper(vals []parser.Value, enh *EvalNodeHelper, f func(time.Time) flo
return append(enh.Out,
Sample{
Metric: labels.Labels{},
Point: Point{V: f(time.Unix(enh.Ts/1000, 0).UTC())},
F: f(time.Unix(enh.Ts/1000, 0).UTC()),
})
}
for _, el := range vals[0].(Vector) {
t := time.Unix(int64(el.V), 0).UTC()
t := time.Unix(int64(el.F), 0).UTC()
enh.Out = append(enh.Out, Sample{
Metric: enh.DropMetricName(el.Metric),
Point: Point{V: f(t)},
F: f(t),
})
}
return enh.Out
@ -1332,10 +1399,20 @@ func (s vectorByValueHeap) Len() int {
}
func (s vectorByValueHeap) Less(i, j int) bool {
if math.IsNaN(s[i].V) {
// We compare histograms based on their sum of observations.
// TODO(beorn7): Is that what we want?
vi, vj := s[i].F, s[j].F
if s[i].H != nil {
vi = s[i].H.Sum
}
if s[j].H != nil {
vj = s[j].H.Sum
}
if math.IsNaN(vi) {
return true
}
return s[i].V < s[j].V
return vi < vj
}
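Sorting now has to cope with mixed vectors: a histogram sample contributes its `Sum` as the comparison key, and NaN still sorts first. A sketch of the same ordering applied via `sort.Slice`, with a stand-in histogram type:

```
package main

import (
	"fmt"
	"math"
	"sort"
)

type hist struct{ Sum float64 } // Stand-in for histogram.FloatHistogram.

type sample struct {
	F float64
	H *hist
}

// key returns the value a sample sorts by: its float, or its
// histogram's sum of observations.
func key(s sample) float64 {
	if s.H != nil {
		return s.H.Sum
	}
	return s.F
}

func main() {
	v := []sample{{F: 2}, {H: &hist{Sum: 1.5}}, {F: math.NaN()}}
	sort.Slice(v, func(i, j int) bool {
		vi, vj := key(v[i]), key(v[j])
		if math.IsNaN(vi) {
			return true // NaN sorts before everything, as in Less above.
		}
		return vi < vj
	})
	for _, s := range v {
		fmt.Println(key(s)) // NaN, 1.5, 2
	}
}
```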
func (s vectorByValueHeap) Swap(i, j int) {
@ -1361,10 +1438,20 @@ func (s vectorByReverseValueHeap) Len() int {
}
func (s vectorByReverseValueHeap) Less(i, j int) bool {
if math.IsNaN(s[i].V) {
// We compare histograms based on their sum of observations.
// TODO(beorn7): Is that what we want?
vi, vj := s[i].F, s[j].F
if s[i].H != nil {
vi = s[i].H.Sum
}
if s[j].H != nil {
vj = s[j].H.Sum
}
if math.IsNaN(vi) {
return true
}
return s[i].V > s[j].V
return vi > vj
}
func (s vectorByReverseValueHeap) Swap(i, j int) {

promql/functions_test.go (2 lines changed)

@ -64,7 +64,7 @@ func TestDeriv(t *testing.T) {
vec, _ := result.Vector()
require.Equal(t, 1, len(vec), "Expected 1 result, got %d", len(vec))
require.Equal(t, 0.0, vec[0].V, "Expected 0.0 as value, got %f", vec[0].V)
require.Equal(t, 0.0, vec[0].F, "Expected 0.0 as value, got %f", vec[0].F)
}
func TestFunctionList(t *testing.T) {

promql/quantile.go (2 lines changed)

@ -382,5 +382,5 @@ func quantile(q float64, values vectorByValueHeap) float64 {
upperIndex := math.Min(n-1, lowerIndex+1)
weight := rank - math.Floor(rank)
return values[int(lowerIndex)].V*(1-weight) + values[int(upperIndex)].V*weight
return values[int(lowerIndex)].F*(1-weight) + values[int(upperIndex)].F*weight
}

promql/test.go (26 lines changed)

@ -281,7 +281,7 @@ func (*evalCmd) testCmd() {}
type loadCmd struct {
gap time.Duration
metrics map[uint64]labels.Labels
defs map[uint64][]Point
defs map[uint64][]FPoint
exemplars map[uint64][]exemplar.Exemplar
}
@ -289,7 +289,7 @@ func newLoadCmd(gap time.Duration) *loadCmd {
return &loadCmd{
gap: gap,
metrics: map[uint64]labels.Labels{},
defs: map[uint64][]Point{},
defs: map[uint64][]FPoint{},
exemplars: map[uint64][]exemplar.Exemplar{},
}
}
@ -302,13 +302,13 @@ func (cmd loadCmd) String() string {
func (cmd *loadCmd) set(m labels.Labels, vals ...parser.SequenceValue) {
h := m.Hash()
samples := make([]Point, 0, len(vals))
samples := make([]FPoint, 0, len(vals))
ts := testStartTime
for _, v := range vals {
if !v.Omitted {
samples = append(samples, Point{
samples = append(samples, FPoint{
T: ts.UnixNano() / int64(time.Millisecond/time.Nanosecond),
V: v.Value,
F: v.Value,
})
}
ts = ts.Add(cmd.gap)
@ -323,7 +323,7 @@ func (cmd *loadCmd) append(a storage.Appender) error {
m := cmd.metrics[h]
for _, s := range smpls {
if _, err := a.Append(0, m, s.T, s.V); err != nil {
if _, err := a.Append(0, m, s.T, s.F); err != nil {
return err
}
}
@ -399,8 +399,8 @@ func (ev *evalCmd) compareResult(result parser.Value) error {
if ev.ordered && exp.pos != pos+1 {
return fmt.Errorf("expected metric %s with %v at position %d but was at %d", v.Metric, exp.vals, exp.pos, pos+1)
}
if !almostEqual(exp.vals[0].Value, v.V) {
return fmt.Errorf("expected %v for %s but got %v", exp.vals[0].Value, v.Metric, v.V)
if !almostEqual(exp.vals[0].Value, v.F) {
return fmt.Errorf("expected %v for %s but got %v", exp.vals[0].Value, v.Metric, v.F)
}
seen[fp] = true
@ -409,7 +409,7 @@ func (ev *evalCmd) compareResult(result parser.Value) error {
if !seen[fp] {
fmt.Println("vector result", len(val), ev.expr)
for _, ss := range val {
fmt.Println(" ", ss.Metric, ss.Point)
fmt.Println(" ", ss.Metric, ss.T, ss.F)
}
return fmt.Errorf("expected metric %s with %v not found", ev.metrics[fp], expVals)
}
@ -576,15 +576,15 @@ func (t *Test) exec(tc testCommand) error {
mat := rangeRes.Value.(Matrix)
vec := make(Vector, 0, len(mat))
for _, series := range mat {
for _, point := range series.Points {
for _, point := range series.Floats {
if point.T == timeMilliseconds(iq.evalTime) {
vec = append(vec, Sample{Metric: series.Metric, Point: point})
vec = append(vec, Sample{Metric: series.Metric, T: point.T, F: point.F})
break
}
}
}
if _, ok := res.Value.(Scalar); ok {
err = cmd.compareResult(Scalar{V: vec[0].Point.V})
err = cmd.compareResult(Scalar{V: vec[0].F})
} else {
err = cmd.compareResult(vec)
}
@ -763,7 +763,7 @@ func (ll *LazyLoader) appendTill(ts int64) error {
ll.loadCmd.defs[h] = smpls[i:]
break
}
if _, err := app.Append(0, m, s.T, s.V); err != nil {
if _, err := app.Append(0, m, s.T, s.F); err != nil {
return err
}
if i == len(smpls)-1 {

promql/test_test.go (22 lines changed)

@ -47,8 +47,8 @@ func TestLazyLoader_WithSamplesTill(t *testing.T) {
series: []Series{
{
Metric: labels.FromStrings("__name__", "metric1"),
Points: []Point{
{0, 1, nil}, {10000, 2, nil}, {20000, 3, nil}, {30000, 4, nil}, {40000, 5, nil},
Floats: []FPoint{
{0, 1}, {10000, 2}, {20000, 3}, {30000, 4}, {40000, 5},
},
},
},
@ -58,8 +58,8 @@ func TestLazyLoader_WithSamplesTill(t *testing.T) {
series: []Series{
{
Metric: labels.FromStrings("__name__", "metric1"),
Points: []Point{
{0, 1, nil}, {10000, 2, nil}, {20000, 3, nil}, {30000, 4, nil}, {40000, 5, nil},
Floats: []FPoint{
{0, 1}, {10000, 2}, {20000, 3}, {30000, 4}, {40000, 5},
},
},
},
@ -69,8 +69,8 @@ func TestLazyLoader_WithSamplesTill(t *testing.T) {
series: []Series{
{
Metric: labels.FromStrings("__name__", "metric1"),
Points: []Point{
{0, 1, nil}, {10000, 2, nil}, {20000, 3, nil}, {30000, 4, nil}, {40000, 5, nil}, {50000, 6, nil}, {60000, 7, nil},
Floats: []FPoint{
{0, 1}, {10000, 2}, {20000, 3}, {30000, 4}, {40000, 5}, {50000, 6}, {60000, 7},
},
},
},
@ -89,14 +89,14 @@ func TestLazyLoader_WithSamplesTill(t *testing.T) {
series: []Series{
{
Metric: labels.FromStrings("__name__", "metric1"),
Points: []Point{
{0, 1, nil}, {10000, 1, nil}, {20000, 1, nil}, {30000, 1, nil}, {40000, 1, nil}, {50000, 1, nil},
Floats: []FPoint{
{0, 1}, {10000, 1}, {20000, 1}, {30000, 1}, {40000, 1}, {50000, 1},
},
},
{
Metric: labels.FromStrings("__name__", "metric2"),
Points: []Point{
{0, 1, nil}, {10000, 2, nil}, {20000, 3, nil}, {30000, 4, nil}, {40000, 5, nil}, {50000, 6, nil}, {60000, 7, nil}, {70000, 8, nil},
Floats: []FPoint{
{0, 1}, {10000, 2}, {20000, 3}, {30000, 4}, {40000, 5}, {50000, 6}, {60000, 7}, {70000, 8},
},
},
},
@ -146,7 +146,7 @@ func TestLazyLoader_WithSamplesTill(t *testing.T) {
it := storageSeries.Iterator(nil)
for it.Next() == chunkenc.ValFloat {
t, v := it.At()
got.Points = append(got.Points, Point{T: t, V: v})
got.Floats = append(got.Floats, FPoint{T: t, F: v})
}
require.NoError(t, it.Err())

promql/value.go (239 lines changed)

@ -17,6 +17,7 @@ import (
"encoding/json"
"errors"
"fmt"
"math"
"strconv"
"strings"
@ -64,76 +65,72 @@ func (s Scalar) MarshalJSON() ([]byte, error) {
// Series is a stream of data points belonging to a metric.
type Series struct {
Metric labels.Labels
Points []Point
Metric labels.Labels `json:"metric"`
Floats []FPoint `json:"values,omitempty"`
Histograms []HPoint `json:"histograms,omitempty"`
}
func (s Series) String() string {
vals := make([]string, len(s.Points))
for i, v := range s.Points {
vals[i] = v.String()
// TODO(beorn7): This currently renders floats first and then
// histograms, each sorted by timestamp. Maybe, in mixed series, that's
// fine. Maybe, however, primary sorting by timestamp is preferred, in
// which case this has to be changed.
vals := make([]string, 0, len(s.Floats)+len(s.Histograms))
for _, f := range s.Floats {
vals = append(vals, f.String())
}
for _, h := range s.Histograms {
vals = append(vals, h.String())
}
return fmt.Sprintf("%s =>\n%s", s.Metric, strings.Join(vals, "\n"))
}
// MarshalJSON is mirrored in web/api/v1/api.go for efficiency reasons.
// This implementation is still provided for debug purposes and usage
// without jsoniter.
func (s Series) MarshalJSON() ([]byte, error) {
// Note that this is rather inefficient because it re-creates the whole
// series, just separated by Histogram Points and Value Points. For API
// purposes, there is a more efficient jsoniter implementation in
// web/api/v1/api.go.
series := struct {
M labels.Labels `json:"metric"`
V []Point `json:"values,omitempty"`
H []Point `json:"histograms,omitempty"`
}{
M: s.Metric,
}
for _, p := range s.Points {
if p.H == nil {
series.V = append(series.V, p)
continue
}
series.H = append(series.H, p)
}
return json.Marshal(series)
// FPoint represents a single float data point for a given timestamp.
type FPoint struct {
T int64
F float64
}
func (p FPoint) String() string {
s := strconv.FormatFloat(p.F, 'f', -1, 64)
return fmt.Sprintf("%s @[%v]", s, p.T)
}
// MarshalJSON implements json.Marshaler.
//
// JSON marshaling is only needed for the HTTP API. Since FPoint is such a
// frequently marshaled type, it gets an optimized treatment directly in
// web/api/v1/api.go. Therefore, this method is unused within Prometheus. It is
// still provided here as convenience for debugging and for other users of this
// code. Also note that the different marshaling implementations might lead to
// slightly different results in terms of formatting and rounding of the
// timestamp.
func (p FPoint) MarshalJSON() ([]byte, error) {
v := strconv.FormatFloat(p.F, 'f', -1, 64)
return json.Marshal([...]interface{}{float64(p.T) / 1000, v})
}
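The wire format is unchanged by the split: an `FPoint` still marshals to a `[unix_seconds, "value"]` pair, with the value quoted so that `Inf` and `NaN` survive JSON. A quick hand-rolled check of that shape (formatting may differ slightly from the jsoniter fast path, as noted above):

```
package main

import (
	"encoding/json"
	"fmt"
	"strconv"
)

func main() {
	// An FPoint at t=1500ms with value 3.14, marshaled by hand the same
	// way FPoint.MarshalJSON does it.
	t, f := int64(1500), 3.14
	v := strconv.FormatFloat(f, 'f', -1, 64)
	b, err := json.Marshal([...]interface{}{float64(t) / 1000, v})
	if err != nil {
		panic(err)
	}
	fmt.Println(string(b)) // [1.5,"3.14"]
}
```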
// Point represents a single data point for a given timestamp.
// If H is not nil, then this is a histogram point and only (T, H) is valid.
// If H is nil, then only (T, V) is valid.
type Point struct {
// HPoint represents a single histogram data point for a given timestamp.
// H must never be nil.
type HPoint struct {
T int64
V float64
H *histogram.FloatHistogram
}
func (p Point) String() string {
var s string
if p.H != nil {
s = p.H.String()
} else {
s = strconv.FormatFloat(p.V, 'f', -1, 64)
}
return fmt.Sprintf("%s @[%v]", s, p.T)
func (p HPoint) String() string {
return fmt.Sprintf("%s @[%v]", p.H.String(), p.T)
}
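The payoff of the split shows up in the struct sizes themselves: dropping the histogram pointer from every float point shrinks it by a third on 64-bit platforms, which is what the benchmark comparisons chase. A standalone illustration, with a stand-in histogram type:

```
package main

import (
	"fmt"
	"unsafe"
)

type FloatHistogram struct{} // Stand-in for histogram.FloatHistogram.

// The old polymorphous point carried a value and a histogram pointer.
type Point struct {
	T int64
	V float64
	H *FloatHistogram
}

// The new split types each carry only what they need.
type FPoint struct {
	T int64
	F float64
}

type HPoint struct {
	T int64
	H *FloatHistogram
}

func main() {
	// On 64-bit platforms: 24 bytes per point before, 16 bytes after.
	fmt.Println(unsafe.Sizeof(Point{}), unsafe.Sizeof(FPoint{}), unsafe.Sizeof(HPoint{}))
}
```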
// MarshalJSON implements json.Marshaler.
//
// JSON marshaling is only needed for the HTTP API. Since Point is such a
// JSON marshaling is only needed for the HTTP API. Since HPoint is such a
// frequently marshaled type, it gets an optimized treatment directly in
// web/api/v1/api.go. Therefore, this method is unused within Prometheus. It is
// still provided here as convenience for debugging and for other users of this
// code. Also note that the different marshaling implementations might lead to
// slightly different results in terms of formatting and rounding of the
// timestamp.
func (p Point) MarshalJSON() ([]byte, error) {
if p.H == nil {
v := strconv.FormatFloat(p.V, 'f', -1, 64)
return json.Marshal([...]interface{}{float64(p.T) / 1000, v})
}
func (p HPoint) MarshalJSON() ([]byte, error) {
h := struct {
Count string `json:"count"`
Sum string `json:"sum"`
@ -171,42 +168,54 @@ func (p Point) MarshalJSON() ([]byte, error) {
return json.Marshal([...]interface{}{float64(p.T) / 1000, h})
}
// Sample is a single sample belonging to a metric.
// Sample is a single sample belonging to a metric. It represents either a float
// sample or a histogram sample. If H is nil, it is a float sample. Otherwise,
// it is a histogram sample.
type Sample struct {
Point
T int64
F float64
H *histogram.FloatHistogram
Metric labels.Labels
}
func (s Sample) String() string {
return fmt.Sprintf("%s => %s", s.Metric, s.Point)
var str string
if s.H == nil {
p := FPoint{T: s.T, F: s.F}
str = p.String()
} else {
p := HPoint{T: s.T, H: s.H}
str = p.String()
}
return fmt.Sprintf("%s => %s", s.Metric, str)
}
// MarshalJSON is mirrored in web/api/v1/api.go with jsoniter because Point
// wouldn't be marshaled with jsoniter in all cases otherwise.
// MarshalJSON is mirrored in web/api/v1/api.go with jsoniter because FPoint and
// HPoint wouldn't be marshaled with jsoniter otherwise.
func (s Sample) MarshalJSON() ([]byte, error) {
if s.Point.H == nil {
v := struct {
if s.H == nil {
f := struct {
M labels.Labels `json:"metric"`
V Point `json:"value"`
F FPoint `json:"value"`
}{
M: s.Metric,
V: s.Point,
F: FPoint{T: s.T, F: s.F},
}
return json.Marshal(v)
return json.Marshal(f)
}
h := struct {
M labels.Labels `json:"metric"`
H Point `json:"histogram"`
H HPoint `json:"histogram"`
}{
M: s.Metric,
H: s.Point,
H: HPoint{T: s.T, H: s.H},
}
return json.Marshal(h)
}
// Vector is basically only an alias for model.Samples, but the
// contract is that in a Vector, all Samples have the same timestamp.
// Vector is basically only an alias for []Sample, but the contract is that
// in a Vector, all Samples have the same timestamp.
type Vector []Sample
func (vec Vector) String() string {
@ -258,7 +267,7 @@ func (m Matrix) String() string {
func (m Matrix) TotalSamples() int {
numSamples := 0
for _, series := range m {
numSamples += len(series.Points)
numSamples += len(series.Floats) + len(series.Histograms)
}
return numSamples
}
@ -362,7 +371,8 @@ func (ss *StorageSeries) Labels() labels.Labels {
return ss.series.Metric
}
// Iterator returns a new iterator of the data of the series.
// Iterator returns a new iterator of the data of the series. In case of
// multiple samples with the same timestamp, it returns the float samples first.
func (ss *StorageSeries) Iterator(it chunkenc.Iterator) chunkenc.Iterator {
if ssi, ok := it.(*storageSeriesIterator); ok {
ssi.reset(ss.series)
@ -372,44 +382,51 @@ func (ss *StorageSeries) Iterator(it chunkenc.Iterator) chunkenc.Iterator {
}
type storageSeriesIterator struct {
points []Point
curr int
floats []FPoint
histograms []HPoint
iFloats, iHistograms int
currT int64
currF float64
currH *histogram.FloatHistogram
}
func newStorageSeriesIterator(series Series) *storageSeriesIterator {
return &storageSeriesIterator{
points: series.Points,
curr: -1,
floats: series.Floats,
histograms: series.Histograms,
iFloats: -1,
iHistograms: 0,
currT: math.MinInt64,
}
}
func (ssi *storageSeriesIterator) reset(series Series) {
ssi.points = series.Points
ssi.curr = -1
ssi.floats = series.Floats
ssi.histograms = series.Histograms
ssi.iFloats = -1
ssi.iHistograms = 0
ssi.currT = math.MinInt64
ssi.currF = 0
ssi.currH = nil
}
func (ssi *storageSeriesIterator) Seek(t int64) chunkenc.ValueType {
i := ssi.curr
if i < 0 {
i = 0
if ssi.iFloats >= len(ssi.floats) && ssi.iHistograms >= len(ssi.histograms) {
return chunkenc.ValNone
}
for ; i < len(ssi.points); i++ {
p := ssi.points[i]
if p.T >= t {
ssi.curr = i
if p.H != nil {
return chunkenc.ValFloatHistogram
}
return chunkenc.ValFloat
for ssi.currT < t {
if ssi.Next() == chunkenc.ValNone {
return chunkenc.ValNone
}
}
ssi.curr = len(ssi.points) - 1
return chunkenc.ValNone
if ssi.currH != nil {
return chunkenc.ValFloatHistogram
}
return chunkenc.ValFloat
}
func (ssi *storageSeriesIterator) At() (t int64, v float64) {
p := ssi.points[ssi.curr]
return p.T, p.V
return ssi.currT, ssi.currF
}
func (ssi *storageSeriesIterator) AtHistogram() (int64, *histogram.Histogram) {
@ -417,25 +434,59 @@ func (ssi *storageSeriesIterator) AtHistogram() (int64, *histogram.Histogram) {
}
func (ssi *storageSeriesIterator) AtFloatHistogram() (int64, *histogram.FloatHistogram) {
p := ssi.points[ssi.curr]
return p.T, p.H
return ssi.currT, ssi.currH
}
func (ssi *storageSeriesIterator) AtT() int64 {
p := ssi.points[ssi.curr]
return p.T
return ssi.currT
}
func (ssi *storageSeriesIterator) Next() chunkenc.ValueType {
ssi.curr++
if ssi.curr >= len(ssi.points) {
return chunkenc.ValNone
if ssi.currH != nil {
ssi.iHistograms++
} else {
ssi.iFloats++
}
p := ssi.points[ssi.curr]
if p.H != nil {
var (
pickH, pickF = false, false
floatsExhausted = ssi.iFloats >= len(ssi.floats)
histogramsExhausted = ssi.iHistograms >= len(ssi.histograms)
)
switch {
case floatsExhausted:
if histogramsExhausted { // Both exhausted!
return chunkenc.ValNone
}
pickH = true
case histogramsExhausted: // and floats not exhausted.
pickF = true
// From here on, we have to look at timestamps.
case ssi.histograms[ssi.iHistograms].T < ssi.floats[ssi.iFloats].T:
// Next histogram comes before next float.
pickH = true
default:
// In all other cases, we pick float so that we first iterate
// through floats if the timestamp is the same.
pickF = true
}
switch {
case pickF:
p := ssi.floats[ssi.iFloats]
ssi.currT = p.T
ssi.currF = p.F
ssi.currH = nil
return chunkenc.ValFloat
case pickH:
p := ssi.histograms[ssi.iHistograms]
ssi.currT = p.T
ssi.currF = 0
ssi.currH = p.H
return chunkenc.ValFloatHistogram
default:
panic("storageSeriesIterater.Next failed to pick value type")
}
return chunkenc.ValFloat
}
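`Next` is effectively a two-way merge over two timestamp-sorted slices, with the float cursor winning ties. Stripped of the `chunkenc` plumbing, the cursor logic looks roughly like this:

```
package main

import "fmt"

// mergeTimestamps walks two sorted timestamp slices the way
// storageSeriesIterator.Next does: floats win ties.
func mergeTimestamps(floats, histograms []int64) []string {
	var out []string
	i, j := 0, 0
	for i < len(floats) || j < len(histograms) {
		switch {
		case i >= len(floats):
			out = append(out, fmt.Sprintf("H@%d", histograms[j]))
			j++
		case j >= len(histograms):
			out = append(out, fmt.Sprintf("F@%d", floats[i]))
			i++
		case histograms[j] < floats[i]:
			// Next histogram comes strictly before the next float.
			out = append(out, fmt.Sprintf("H@%d", histograms[j]))
			j++
		default:
			// On ties, emit the float first.
			out = append(out, fmt.Sprintf("F@%d", floats[i]))
			i++
		}
	}
	return out
}

func main() {
	fmt.Println(mergeTimestamps([]int64{1, 3}, []int64{2, 3}))
	// [F@1 H@2 F@3 H@3]
}
```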
func (ssi *storageSeriesIterator) Err() error {

rules/alerting.go (10 lines changed)

@ -235,7 +235,8 @@ func (r *AlertingRule) sample(alert *Alert, ts time.Time) promql.Sample {
s := promql.Sample{
Metric: lb.Labels(),
Point: promql.Point{T: timestamp.FromTime(ts), V: 1},
T: timestamp.FromTime(ts),
F: 1,
}
return s
}
@ -253,7 +254,8 @@ func (r *AlertingRule) forStateSample(alert *Alert, ts time.Time, v float64) pro
s := promql.Sample{
Metric: lb.Labels(),
Point: promql.Point{T: timestamp.FromTime(ts), V: v},
T: timestamp.FromTime(ts),
F: v,
}
return s
}
@ -339,7 +341,7 @@ func (r *AlertingRule) Eval(ctx context.Context, ts time.Time, query QueryFunc,
// Provide the alert information to the template.
l := smpl.Metric.Map()
tmplData := template.AlertTemplateData(l, r.externalLabels, r.externalURL, smpl.V)
tmplData := template.AlertTemplateData(l, r.externalLabels, r.externalURL, smpl.F)
// Inject some convenience variables that are easier to remember for users
// who are not used to Go's templating system.
defs := []string{
@ -394,7 +396,7 @@ func (r *AlertingRule) Eval(ctx context.Context, ts time.Time, query QueryFunc,
Annotations: annotations,
ActiveAt: ts,
State: StatePending,
Value: smpl.V,
Value: smpl.F,
}
}

rules/alerting_test.go (46 lines changed)

@ -109,7 +109,7 @@ func TestAlertingRuleLabelsUpdate(t *testing.T) {
"job", "app-server",
"severity", "critical",
),
Point: promql.Point{V: 1},
F: 1,
},
},
{
@ -122,7 +122,7 @@ func TestAlertingRuleLabelsUpdate(t *testing.T) {
"job", "app-server",
"severity", "warning",
),
Point: promql.Point{V: 1},
F: 1,
},
},
{
@ -135,7 +135,7 @@ func TestAlertingRuleLabelsUpdate(t *testing.T) {
"job", "app-server",
"severity", "critical",
),
Point: promql.Point{V: 1},
F: 1,
},
},
{
@ -148,7 +148,7 @@ func TestAlertingRuleLabelsUpdate(t *testing.T) {
"job", "app-server",
"severity", "critical",
),
Point: promql.Point{V: 1},
F: 1,
},
},
}
@ -157,7 +157,7 @@ func TestAlertingRuleLabelsUpdate(t *testing.T) {
for i, result := range results {
t.Logf("case %d", i)
evalTime := baseTime.Add(time.Duration(i) * time.Minute)
result[0].Point.T = timestamp.FromTime(evalTime)
result[0].T = timestamp.FromTime(evalTime)
res, err := rule.Eval(suite.Context(), evalTime, EngineQueryFunc(suite.QueryEngine(), suite.Storage()), nil, 0)
require.NoError(t, err)
@ -225,7 +225,7 @@ func TestAlertingRuleExternalLabelsInTemplate(t *testing.T) {
"job", "app-server",
"templated_label", "There are 0 external Labels, of which foo is .",
),
Point: promql.Point{V: 1},
F: 1,
},
promql.Sample{
Metric: labels.FromStrings(
@ -236,13 +236,13 @@ func TestAlertingRuleExternalLabelsInTemplate(t *testing.T) {
"job", "app-server",
"templated_label", "There are 2 external Labels, of which foo is bar.",
),
Point: promql.Point{V: 1},
F: 1,
},
}
evalTime := time.Unix(0, 0)
result[0].Point.T = timestamp.FromTime(evalTime)
result[1].Point.T = timestamp.FromTime(evalTime)
result[0].T = timestamp.FromTime(evalTime)
result[1].T = timestamp.FromTime(evalTime)
var filteredRes promql.Vector // After removing 'ALERTS_FOR_STATE' samples.
res, err := ruleWithoutExternalLabels.Eval(
@ -321,7 +321,7 @@ func TestAlertingRuleExternalURLInTemplate(t *testing.T) {
"job", "app-server",
"templated_label", "The external URL is .",
),
Point: promql.Point{V: 1},
F: 1,
},
promql.Sample{
Metric: labels.FromStrings(
@ -332,13 +332,13 @@ func TestAlertingRuleExternalURLInTemplate(t *testing.T) {
"job", "app-server",
"templated_label", "The external URL is http://localhost:1234.",
),
Point: promql.Point{V: 1},
F: 1,
},
}
evalTime := time.Unix(0, 0)
result[0].Point.T = timestamp.FromTime(evalTime)
result[1].Point.T = timestamp.FromTime(evalTime)
result[0].T = timestamp.FromTime(evalTime)
result[1].T = timestamp.FromTime(evalTime)
var filteredRes promql.Vector // After removing 'ALERTS_FOR_STATE' samples.
res, err := ruleWithoutExternalURL.Eval(
@ -405,12 +405,12 @@ func TestAlertingRuleEmptyLabelFromTemplate(t *testing.T) {
"instance", "0",
"job", "app-server",
),
Point: promql.Point{V: 1},
F: 1,
},
}
evalTime := time.Unix(0, 0)
result[0].Point.T = timestamp.FromTime(evalTime)
result[0].T = timestamp.FromTime(evalTime)
var filteredRes promql.Vector // After removing 'ALERTS_FOR_STATE' samples.
res, err := rule.Eval(
@ -760,7 +760,7 @@ func TestKeepFiringFor(t *testing.T) {
"instance", "0",
"job", "app-server",
),
Point: promql.Point{V: 1},
F: 1,
},
},
{
@ -772,7 +772,7 @@ func TestKeepFiringFor(t *testing.T) {
"instance", "0",
"job", "app-server",
),
Point: promql.Point{V: 1},
F: 1,
},
},
{
@ -784,7 +784,7 @@ func TestKeepFiringFor(t *testing.T) {
"instance", "0",
"job", "app-server",
),
Point: promql.Point{V: 1},
F: 1,
},
},
{
@ -796,7 +796,7 @@ func TestKeepFiringFor(t *testing.T) {
"instance", "0",
"job", "app-server",
),
Point: promql.Point{V: 1},
F: 1,
},
},
// From now on the alert should keep firing.
@ -809,7 +809,7 @@ func TestKeepFiringFor(t *testing.T) {
"instance", "0",
"job", "app-server",
),
Point: promql.Point{V: 1},
F: 1,
},
},
}
@ -818,7 +818,7 @@ func TestKeepFiringFor(t *testing.T) {
for i, result := range results {
t.Logf("case %d", i)
evalTime := baseTime.Add(time.Duration(i) * time.Minute)
result[0].Point.T = timestamp.FromTime(evalTime)
result[0].T = timestamp.FromTime(evalTime)
res, err := rule.Eval(suite.Context(), evalTime, EngineQueryFunc(suite.QueryEngine(), suite.Storage()), nil, 0)
require.NoError(t, err)
@ -871,11 +871,11 @@ func TestPendingAndKeepFiringFor(t *testing.T) {
"instance", "0",
"job", "app-server",
),
Point: promql.Point{V: 1},
F: 1,
}
baseTime := time.Unix(0, 0)
result.Point.T = timestamp.FromTime(baseTime)
result.T = timestamp.FromTime(baseTime)
res, err := rule.Eval(suite.Context(), baseTime, EngineQueryFunc(suite.QueryEngine(), suite.Storage()), nil, 0)
require.NoError(t, err)

rules/manager.go (5 lines changed)

@ -202,7 +202,8 @@ func EngineQueryFunc(engine *promql.Engine, q storage.Queryable) QueryFunc {
return v, nil
case promql.Scalar:
return promql.Vector{promql.Sample{
Point: promql.Point{T: v.T, V: v.V},
T: v.T,
F: v.V,
Metric: labels.Labels{},
}}, nil
default:
@ -695,7 +696,7 @@ func (g *Group) Eval(ctx context.Context, ts time.Time) {
if s.H != nil {
_, err = app.AppendHistogram(0, s.Metric, s.T, nil, s.H)
} else {
_, err = app.Append(0, s.Metric, s.T, s.V)
_, err = app.Append(0, s.Metric, s.T, s.F)
}
if err != nil {

rules/manager_test.go (40 lines changed)

@ -80,7 +80,7 @@ func TestAlertingRule(t *testing.T) {
"job", "app-server",
"severity", "critical",
),
Point: promql.Point{V: 1},
F: 1,
},
promql.Sample{
Metric: labels.FromStrings(
@ -92,7 +92,7 @@ func TestAlertingRule(t *testing.T) {
"job", "app-server",
"severity", "critical",
),
Point: promql.Point{V: 1},
F: 1,
},
promql.Sample{
Metric: labels.FromStrings(
@ -104,7 +104,7 @@ func TestAlertingRule(t *testing.T) {
"job", "app-server",
"severity", "critical",
),
Point: promql.Point{V: 1},
F: 1,
},
promql.Sample{
Metric: labels.FromStrings(
@ -116,7 +116,7 @@ func TestAlertingRule(t *testing.T) {
"job", "app-server",
"severity", "critical",
),
Point: promql.Point{V: 1},
F: 1,
},
}
@ -223,7 +223,7 @@ func TestForStateAddSamples(t *testing.T) {
"job", "app-server",
"severity", "critical",
),
Point: promql.Point{V: 1},
F: 1,
},
promql.Sample{
Metric: labels.FromStrings(
@ -234,7 +234,7 @@ func TestForStateAddSamples(t *testing.T) {
"job", "app-server",
"severity", "critical",
),
Point: promql.Point{V: 1},
F: 1,
},
promql.Sample{
Metric: labels.FromStrings(
@ -245,7 +245,7 @@ func TestForStateAddSamples(t *testing.T) {
"job", "app-server",
"severity", "critical",
),
Point: promql.Point{V: 1},
F: 1,
},
promql.Sample{
Metric: labels.FromStrings(
@ -256,7 +256,7 @@ func TestForStateAddSamples(t *testing.T) {
"job", "app-server",
"severity", "critical",
),
Point: promql.Point{V: 1},
F: 1,
},
}
@ -327,8 +327,8 @@ func TestForStateAddSamples(t *testing.T) {
for i := range test.result {
test.result[i].T = timestamp.FromTime(evalTime)
// Updating the expected 'for' state.
if test.result[i].V >= 0 {
test.result[i].V = forState
if test.result[i].F >= 0 {
test.result[i].F = forState
}
}
require.Equal(t, len(test.result), len(filteredRes), "%d. Number of samples in expected and actual output don't match (%d vs. %d)", i, len(test.result), len(res))
@ -584,29 +584,29 @@ func TestStaleness(t *testing.T) {
metricSample, ok := samples[metric]
require.True(t, ok, "Series %s not returned.", metric)
require.True(t, value.IsStaleNaN(metricSample[2].V), "Appended second sample not as expected. Wanted: stale NaN Got: %x", math.Float64bits(metricSample[2].V))
metricSample[2].V = 42 // require.Equal cannot handle NaN.
require.True(t, value.IsStaleNaN(metricSample[2].F), "Appended second sample not as expected. Wanted: stale NaN Got: %x", math.Float64bits(metricSample[2].F))
metricSample[2].F = 42 // require.Equal cannot handle NaN.
want := map[string][]promql.Point{
metric: {{T: 0, V: 2}, {T: 1000, V: 3}, {T: 2000, V: 42}},
want := map[string][]promql.FPoint{
metric: {{T: 0, F: 2}, {T: 1000, F: 3}, {T: 2000, F: 42}},
}
require.Equal(t, want, samples)
}
// Convert a SeriesSet into a form usable with require.Equal.
func readSeriesSet(ss storage.SeriesSet) (map[string][]promql.Point, error) {
result := map[string][]promql.Point{}
func readSeriesSet(ss storage.SeriesSet) (map[string][]promql.FPoint, error) {
result := map[string][]promql.FPoint{}
var it chunkenc.Iterator
for ss.Next() {
series := ss.At()
points := []promql.Point{}
points := []promql.FPoint{}
it := series.Iterator(it)
for it.Next() == chunkenc.ValFloat {
t, v := it.At()
points = append(points, promql.Point{T: t, V: v})
points = append(points, promql.FPoint{T: t, F: v})
}
name := series.Labels().String()
@ -707,7 +707,7 @@ func TestDeletedRuleMarkedStale(t *testing.T) {
metricSample, ok := samples[metric]
require.True(t, ok, "Series %s not returned.", metric)
require.True(t, value.IsStaleNaN(metricSample[0].V), "Appended sample not as expected. Wanted: stale NaN Got: %x", math.Float64bits(metricSample[0].V))
require.True(t, value.IsStaleNaN(metricSample[0].F), "Appended sample not as expected. Wanted: stale NaN Got: %x", math.Float64bits(metricSample[0].F))
}
func TestUpdate(t *testing.T) {
@ -1134,7 +1134,7 @@ func countStaleNaN(t *testing.T, st storage.Storage) int {
require.True(t, ok, "Series %s not returned.", metric)
for _, s := range metricSample {
if value.IsStaleNaN(s.V) {
if value.IsStaleNaN(s.F) {
c++
}
}

rules/recording_test.go (24 lines changed)

@ -46,11 +46,13 @@ var ruleEvalTestScenarios = []struct {
expected: promql.Vector{
promql.Sample{
Metric: labels.FromStrings("__name__", "test_rule", "label_a", "1", "label_b", "3"),
Point: promql.Point{V: 1, T: timestamp.FromTime(ruleEvaluationTime)},
F: 1,
T: timestamp.FromTime(ruleEvaluationTime),
},
promql.Sample{
Metric: labels.FromStrings("__name__", "test_rule", "label_a", "2", "label_b", "4"),
Point: promql.Point{V: 10, T: timestamp.FromTime(ruleEvaluationTime)},
F: 10,
T: timestamp.FromTime(ruleEvaluationTime),
},
},
},
@ -61,11 +63,13 @@ var ruleEvalTestScenarios = []struct {
expected: promql.Vector{
promql.Sample{
Metric: labels.FromStrings("__name__", "test_rule", "label_a", "1", "label_b", "3", "extra_from_rule", "foo"),
Point: promql.Point{V: 1, T: timestamp.FromTime(ruleEvaluationTime)},
F: 1,
T: timestamp.FromTime(ruleEvaluationTime),
},
promql.Sample{
Metric: labels.FromStrings("__name__", "test_rule", "label_a", "2", "label_b", "4", "extra_from_rule", "foo"),
Point: promql.Point{V: 10, T: timestamp.FromTime(ruleEvaluationTime)},
F: 10,
T: timestamp.FromTime(ruleEvaluationTime),
},
},
},
@ -76,11 +80,13 @@ var ruleEvalTestScenarios = []struct {
expected: promql.Vector{
promql.Sample{
Metric: labels.FromStrings("__name__", "test_rule", "label_a", "from_rule", "label_b", "3"),
Point: promql.Point{V: 1, T: timestamp.FromTime(ruleEvaluationTime)},
F: 1,
T: timestamp.FromTime(ruleEvaluationTime),
},
promql.Sample{
Metric: labels.FromStrings("__name__", "test_rule", "label_a", "from_rule", "label_b", "4"),
Point: promql.Point{V: 10, T: timestamp.FromTime(ruleEvaluationTime)},
F: 10,
T: timestamp.FromTime(ruleEvaluationTime),
},
},
},
@ -91,11 +97,13 @@ var ruleEvalTestScenarios = []struct {
expected: promql.Vector{
promql.Sample{
Metric: labels.FromStrings("__name__", "test_rule", "label_a", "1", "label_b", "3"),
Point: promql.Point{V: 2, T: timestamp.FromTime(ruleEvaluationTime)},
F: 2,
T: timestamp.FromTime(ruleEvaluationTime),
},
promql.Sample{
Metric: labels.FromStrings("__name__", "test_rule", "label_a", "2", "label_b", "4"),
-Point: promql.Point{V: 20, T: timestamp.FromTime(ruleEvaluationTime)},
+F: 20,
+T: timestamp.FromTime(ruleEvaluationTime),
},
},
},
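All four scenarios above apply the same mechanical rewrite: `Sample` no longer embeds a `Point`, so the timestamp and value become plain fields. A hedged before/after sketch (post-commit `promql` API assumed; the label set is made up):

```go
package main

import (
	"fmt"
	"time"

	"github.com/prometheus/prometheus/model/labels"
	"github.com/prometheus/prometheus/model/timestamp"
	"github.com/prometheus/prometheus/promql"
)

func main() {
	evalTime := time.Unix(100, 0)
	// Before: promql.Sample{Metric: m, Point: promql.Point{T: ts, V: 1}}
	// After: T and F sit directly on the Sample.
	s := promql.Sample{
		Metric: labels.FromStrings("__name__", "test_rule", "label_a", "1"),
		T:      timestamp.FromTime(evalTime),
		F:      1,
	}
	fmt.Println(s.Metric, s.T, s.F)
}
```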

template/template.go (2 lines changed)

@@ -93,7 +93,7 @@ func query(ctx context.Context, q string, ts time.Time, queryFn QueryFunc) (quer
result := make(queryResult, len(vector))
for n, v := range vector {
s := sample{
-Value: v.V,
+Value: v.F,
Labels: v.Metric.Map(),
}
result[n] = &s

template/template_test.go (26 lines changed)

@@ -70,7 +70,7 @@ func TestTemplateExpansion(t *testing.T) {
{
text: "{{ query \"1.5\" | first | value }}",
output: "1.5",
-queryResult: promql.Vector{{Point: promql.Point{T: 0, V: 1.5}}},
+queryResult: promql.Vector{{T: 0, F: 1.5}},
},
{
// Get value from query.
@@ -78,7 +78,8 @@ func TestTemplateExpansion(t *testing.T) {
queryResult: promql.Vector{
{
Metric: labels.FromStrings(labels.MetricName, "metric", "instance", "a"),
-Point: promql.Point{T: 0, V: 11},
+T: 0,
+F: 11,
},
},
output: "11",
@@ -90,7 +91,8 @@ func TestTemplateExpansion(t *testing.T) {
queryResult: promql.Vector{
{
Metric: labels.FromStrings(labels.MetricName, "metric", "instance", "a"),
-Point: promql.Point{T: 0, V: 11},
+T: 0,
+F: 11,
},
},
output: "a",
@@ -101,7 +103,8 @@ func TestTemplateExpansion(t *testing.T) {
queryResult: promql.Vector{
{
Metric: labels.FromStrings(labels.MetricName, "metric", "__value__", "a"),
-Point: promql.Point{T: 0, V: 11},
+T: 0,
+F: 11,
},
},
output: "a",
@@ -112,7 +115,8 @@ func TestTemplateExpansion(t *testing.T) {
queryResult: promql.Vector{
{
Metric: labels.FromStrings(labels.MetricName, "metric", "instance", "a"),
-Point: promql.Point{T: 0, V: 11},
+T: 0,
+F: 11,
},
},
output: "",
@@ -123,7 +127,8 @@ func TestTemplateExpansion(t *testing.T) {
queryResult: promql.Vector{
{
Metric: labels.FromStrings(labels.MetricName, "metric", "instance", "a"),
-Point: promql.Point{T: 0, V: 11},
+T: 0,
+F: 11,
},
},
output: "",
@@ -133,7 +138,8 @@ func TestTemplateExpansion(t *testing.T) {
queryResult: promql.Vector{
{
Metric: labels.FromStrings(labels.MetricName, "metric", "instance", "a"),
-Point: promql.Point{T: 0, V: 11},
+T: 0,
+F: 11,
},
},
output: "",
@@ -145,10 +151,12 @@ func TestTemplateExpansion(t *testing.T) {
queryResult: promql.Vector{
{
Metric: labels.FromStrings(labels.MetricName, "metric", "instance", "b"),
-Point: promql.Point{T: 0, V: 21},
+T: 0,
+F: 21,
}, {
Metric: labels.FromStrings(labels.MetricName, "metric", "instance", "a"),
-Point: promql.Point{T: 0, V: 11},
+T: 0,
+F: 11,
},
},
output: "a:11: b:21: ",

util/jsonutil/marshal.go (79 lines changed)

@@ -18,6 +18,8 @@ import (
"strconv"
jsoniter "github.com/json-iterator/go"
"github.com/prometheus/prometheus/model/histogram"
)
// MarshalTimestamp marshals a point timestamp using the passed jsoniter stream.
@@ -42,8 +44,8 @@ func MarshalTimestamp(t int64, stream *jsoniter.Stream) {
}
}
-// MarshalValue marshals a point value using the passed jsoniter stream.
-func MarshalValue(v float64, stream *jsoniter.Stream) {
+// MarshalFloat marshals a float value using the passed jsoniter stream.
+func MarshalFloat(v float64, stream *jsoniter.Stream) {
stream.WriteRaw(`"`)
// Taken from https://github.com/json-iterator/go/blob/master/stream_float.go#L71 as a workaround
// to https://github.com/json-iterator/go/issues/365 (jsoniter, to follow json standard, doesn't allow inf/nan).
@@ -60,3 +62,76 @@ func MarshalValue(v float64, stream *jsoniter.Stream) {
stream.SetBuffer(buf)
stream.WriteRaw(`"`)
}
+// MarshalHistogram marshals a histogram value using the passed jsoniter stream.
+// It writes something like:
+//
+// {
+// "count": "42",
+// "sum": "34593.34",
+// "buckets": [
+// [ 3, "-0.25", "0.25", "3"],
+// [ 0, "0.25", "0.5", "12"],
+// [ 0, "0.5", "1", "21"],
+// [ 0, "2", "4", "6"]
+// ]
+// }
+//
+// The 1st element in each bucket array determines if the boundaries are
+// inclusive (AKA closed) or exclusive (AKA open):
+//
+// 0: lower exclusive, upper inclusive
+// 1: lower inclusive, upper exclusive
+// 2: both exclusive
+// 3: both inclusive
+//
+// The 2nd and 3rd elements are the lower and upper boundary. The 4th element is
+// the bucket count.
+func MarshalHistogram(h *histogram.FloatHistogram, stream *jsoniter.Stream) {
+stream.WriteObjectStart()
+stream.WriteObjectField(`count`)
+MarshalFloat(h.Count, stream)
+stream.WriteMore()
+stream.WriteObjectField(`sum`)
+MarshalFloat(h.Sum, stream)
+bucketFound := false
+it := h.AllBucketIterator()
+for it.Next() {
+bucket := it.At()
+if bucket.Count == 0 {
+continue // No need to expose empty buckets in JSON.
+}
+stream.WriteMore()
+if !bucketFound {
+stream.WriteObjectField(`buckets`)
+stream.WriteArrayStart()
+}
+bucketFound = true
+boundaries := 2 // Exclusive on both sides AKA open interval.
+if bucket.LowerInclusive {
+if bucket.UpperInclusive {
+boundaries = 3 // Inclusive on both sides AKA closed interval.
+} else {
+boundaries = 1 // Inclusive only on lower end AKA right open.
+}
+} else {
+if bucket.UpperInclusive {
+boundaries = 0 // Inclusive only on upper end AKA left open.
+}
+}
+stream.WriteArrayStart()
+stream.WriteInt(boundaries)
+stream.WriteMore()
+MarshalFloat(bucket.Lower, stream)
+stream.WriteMore()
+MarshalFloat(bucket.Upper, stream)
+stream.WriteMore()
+MarshalFloat(bucket.Count, stream)
+stream.WriteArrayEnd()
+}
+if bucketFound {
+stream.WriteArrayEnd()
+}
+stream.WriteObjectEnd()
+}
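A usage sketch for the newly exported helper (the histogram literal is invented; the exact bucket boundaries depend on the schema, so the output comment is only indicative):

```go
package main

import (
	"fmt"

	jsoniter "github.com/json-iterator/go"

	"github.com/prometheus/prometheus/model/histogram"
	"github.com/prometheus/prometheus/util/jsonutil"
)

func main() {
	h := &histogram.FloatHistogram{
		Schema:          0, // Power-of-two bucket boundaries.
		Count:           5,
		Sum:             12.5,
		PositiveSpans:   []histogram.Span{{Offset: 0, Length: 2}},
		PositiveBuckets: []float64{2, 3}, // Two adjacent non-empty buckets.
	}
	api := jsoniter.ConfigCompatibleWithStandardLibrary
	stream := api.BorrowStream(nil)
	defer api.ReturnStream(stream)
	jsonutil.MarshalHistogram(h, stream)
	fmt.Println(string(stream.Buffer()))
	// Shaped like the doc comment above, e.g.:
	// {"count":"5","sum":"12.5","buckets":[[0,"…","…","2"],[0,"…","…","3"]]}
}
```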

web/api/v1/api.go (163 lines changed)

@@ -41,7 +41,6 @@ import (
"github.com/prometheus/prometheus/config"
"github.com/prometheus/prometheus/model/exemplar"
"github.com/prometheus/prometheus/model/histogram"
"github.com/prometheus/prometheus/model/labels"
"github.com/prometheus/prometheus/model/textparse"
"github.com/prometheus/prometheus/model/timestamp"
@@ -217,7 +216,8 @@ type API struct {
func init() {
jsoniter.RegisterTypeEncoderFunc("promql.Series", marshalSeriesJSON, marshalSeriesJSONIsEmpty)
jsoniter.RegisterTypeEncoderFunc("promql.Sample", marshalSampleJSON, marshalSampleJSONIsEmpty)
jsoniter.RegisterTypeEncoderFunc("promql.Point", marshalPointJSON, marshalPointJSONIsEmpty)
jsoniter.RegisterTypeEncoderFunc("promql.FPoint", marshalFPointJSON, marshalPointJSONIsEmpty)
jsoniter.RegisterTypeEncoderFunc("promql.HPoint", marshalHPointJSON, marshalPointJSONIsEmpty)
jsoniter.RegisterTypeEncoderFunc("exemplar.Exemplar", marshalExemplarJSON, marshalExemplarJSONEmpty)
}
@@ -1724,7 +1724,7 @@ OUTER:
// < more values>
// ],
// "histograms": [
-// [ 1435781451.781, { < histogram, see below > } ],
+// [ 1435781451.781, { < histogram, see jsonutil.MarshalHistogram > } ],
// < more histograms >
// ],
// },
@@ -1739,41 +1739,26 @@ func marshalSeriesJSON(ptr unsafe.Pointer, stream *jsoniter.Stream) {
}
stream.SetBuffer(append(stream.Buffer(), m...))
-// We make two passes through the series here: In the first marshaling
-// all value points, in the second marshaling all histogram
-// points. That's probably cheaper than just one pass in which we copy
-// out histogram Points into a newly allocated slice for separate
-// marshaling. (Could be benchmarked, though.)
-var foundValue, foundHistogram bool
-for _, p := range s.Points {
-if p.H == nil {
-stream.WriteMore()
-if !foundValue {
-stream.WriteObjectField(`values`)
-stream.WriteArrayStart()
-}
-foundValue = true
-marshalPointJSON(unsafe.Pointer(&p), stream)
-} else {
-foundHistogram = true
-}
-}
+for i, p := range s.Floats {
+stream.WriteMore()
+if i == 0 {
+stream.WriteObjectField(`values`)
+stream.WriteArrayStart()
+}
+marshalFPointJSON(unsafe.Pointer(&p), stream)
+}
-if foundValue {
+if len(s.Floats) > 0 {
stream.WriteArrayEnd()
}
-if foundHistogram {
-firstHistogram := true
-for _, p := range s.Points {
-if p.H != nil {
-stream.WriteMore()
-if firstHistogram {
-stream.WriteObjectField(`histograms`)
-stream.WriteArrayStart()
-}
-firstHistogram = false
-marshalPointJSON(unsafe.Pointer(&p), stream)
-}
-}
+for i, p := range s.Histograms {
+stream.WriteMore()
+if i == 0 {
+stream.WriteObjectField(`histograms`)
+stream.WriteArrayStart()
+}
+marshalHPointJSON(unsafe.Pointer(&p), stream)
+}
+if len(s.Histograms) > 0 {
stream.WriteArrayEnd()
}
stream.WriteObjectEnd()
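The rewrite above works because `Series` now keeps floats and histograms in two separate, individually sorted slices, so each JSON array falls out of one plain loop instead of two passes over a mixed `Point` slice. A sketch of the shape the marshaler relies on (field names as used above; illustrative, not the full `promql` type):

```go
package sketch

import (
	"github.com/prometheus/prometheus/model/histogram"
	"github.com/prometheus/prometheus/model/labels"
)

type FPoint struct {
	T int64
	F float64
}

type HPoint struct {
	T int64
	H *histogram.FloatHistogram
}

// Series, as consumed by marshalSeriesJSON: the float/histogram split
// is done upstream, so no p.H == nil filtering is needed here anymore.
type Series struct {
	Metric     labels.Labels
	Floats     []FPoint
	Histograms []HPoint
}
```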
@@ -1791,7 +1776,7 @@ func marshalSeriesJSONIsEmpty(ptr unsafe.Pointer) bool {
// "job" : "prometheus",
// "instance" : "localhost:9090"
// },
// "value": [ 1435781451.781, "1" ]
// "value": [ 1435781451.781, "1.234" ]
// },
//
// For histogram samples, it writes something like this:
@@ -1802,7 +1787,7 @@ func marshalSeriesJSONIsEmpty(ptr unsafe.Pointer) bool {
// "job" : "prometheus",
// "instance" : "localhost:9090"
// },
// "histogram": [ 1435781451.781, { < histogram, see below > } ]
// "histogram": [ 1435781451.781, { < histogram, see jsonutil.MarshalHistogram > } ]
// },
func marshalSampleJSON(ptr unsafe.Pointer, stream *jsoniter.Stream) {
s := *((*promql.Sample)(ptr))
@@ -1815,12 +1800,20 @@ func marshalSampleJSON(ptr unsafe.Pointer, stream *jsoniter.Stream) {
}
stream.SetBuffer(append(stream.Buffer(), m...))
stream.WriteMore()
-if s.Point.H == nil {
+if s.H == nil {
stream.WriteObjectField(`value`)
} else {
stream.WriteObjectField(`histogram`)
}
-marshalPointJSON(unsafe.Pointer(&s.Point), stream)
+stream.WriteArrayStart()
+jsonutil.MarshalTimestamp(s.T, stream)
+stream.WriteMore()
+if s.H == nil {
+jsonutil.MarshalFloat(s.F, stream)
+} else {
+jsonutil.MarshalHistogram(s.H, stream)
+}
+stream.WriteArrayEnd()
stream.WriteObjectEnd()
}
@@ -1828,94 +1821,28 @@ func marshalSampleJSONIsEmpty(ptr unsafe.Pointer) bool {
return false
}
-// marshalPointJSON writes `[ts, "val"]`.
-func marshalPointJSON(ptr unsafe.Pointer, stream *jsoniter.Stream) {
-p := *((*promql.Point)(ptr))
+// marshalFPointJSON writes `[ts, "1.234"]`.
+func marshalFPointJSON(ptr unsafe.Pointer, stream *jsoniter.Stream) {
+p := *((*promql.FPoint)(ptr))
stream.WriteArrayStart()
jsonutil.MarshalTimestamp(p.T, stream)
stream.WriteMore()
-if p.H == nil {
-jsonutil.MarshalValue(p.V, stream)
-} else {
-marshalHistogram(p.H, stream)
-}
+jsonutil.MarshalFloat(p.F, stream)
stream.WriteArrayEnd()
}
-func marshalPointJSONIsEmpty(ptr unsafe.Pointer) bool {
-return false
+// marshalHPointJSON writes `[ts, { < histogram, see jsonutil.MarshalHistogram > } ]`.
+func marshalHPointJSON(ptr unsafe.Pointer, stream *jsoniter.Stream) {
+p := *((*promql.HPoint)(ptr))
+stream.WriteArrayStart()
+jsonutil.MarshalTimestamp(p.T, stream)
+stream.WriteMore()
+jsonutil.MarshalHistogram(p.H, stream)
+stream.WriteArrayEnd()
+}
-// marshalHistogramJSON writes something like:
-//
-// {
-// "count": "42",
-// "sum": "34593.34",
-// "buckets": [
-// [ 3, "-0.25", "0.25", "3"],
-// [ 0, "0.25", "0.5", "12"],
-// [ 0, "0.5", "1", "21"],
-// [ 0, "2", "4", "6"]
-// ]
-// }
-//
-// The 1st element in each bucket array determines if the boundaries are
-// inclusive (AKA closed) or exclusive (AKA open):
-//
-// 0: lower exclusive, upper inclusive
-// 1: lower inclusive, upper exclusive
-// 2: both exclusive
-// 3: both inclusive
-//
-// The 2nd and 3rd elements are the lower and upper boundary. The 4th element is
-// the bucket count.
-func marshalHistogram(h *histogram.FloatHistogram, stream *jsoniter.Stream) {
-stream.WriteObjectStart()
-stream.WriteObjectField(`count`)
-jsonutil.MarshalValue(h.Count, stream)
-stream.WriteMore()
-stream.WriteObjectField(`sum`)
-jsonutil.MarshalValue(h.Sum, stream)
-bucketFound := false
-it := h.AllBucketIterator()
-for it.Next() {
-bucket := it.At()
-if bucket.Count == 0 {
-continue // No need to expose empty buckets in JSON.
-}
-stream.WriteMore()
-if !bucketFound {
-stream.WriteObjectField(`buckets`)
-stream.WriteArrayStart()
-}
-bucketFound = true
-boundaries := 2 // Exclusive on both sides AKA open interval.
-if bucket.LowerInclusive {
-if bucket.UpperInclusive {
-boundaries = 3 // Inclusive on both sides AKA closed interval.
-} else {
-boundaries = 1 // Inclusive only on lower end AKA right open.
-}
-} else {
-if bucket.UpperInclusive {
-boundaries = 0 // Inclusive only on upper end AKA left open.
-}
-}
-stream.WriteArrayStart()
-stream.WriteInt(boundaries)
-stream.WriteMore()
-jsonutil.MarshalValue(bucket.Lower, stream)
-stream.WriteMore()
-jsonutil.MarshalValue(bucket.Upper, stream)
-stream.WriteMore()
-jsonutil.MarshalValue(bucket.Count, stream)
-stream.WriteArrayEnd()
-}
-if bucketFound {
-stream.WriteArrayEnd()
-}
-stream.WriteObjectEnd()
-}
+func marshalPointJSONIsEmpty(ptr unsafe.Pointer) bool {
+return false
}
// marshalExemplarJSON writes.
@@ -1941,7 +1868,7 @@ func marshalExemplarJSON(ptr unsafe.Pointer, stream *jsoniter.Stream) {
// "value" key.
stream.WriteMore()
stream.WriteObjectField(`value`)
-jsonutil.MarshalValue(p.Value, stream)
+jsonutil.MarshalFloat(p.Value, stream)
// "timestamp" key.
stream.WriteMore()

web/api/v1/api_test.go (48 lines changed)

@@ -1102,10 +1102,10 @@ func testEndpoints(t *testing.T, api *API, tr *testTargetRetriever, es storage.E
ResultType: parser.ValueTypeMatrix,
Result: promql.Matrix{
promql.Series{
-Points: []promql.Point{
-{V: 0, T: timestamp.FromTime(start)},
-{V: 1, T: timestamp.FromTime(start.Add(1 * time.Second))},
-{V: 2, T: timestamp.FromTime(start.Add(2 * time.Second))},
+Floats: []promql.FPoint{
+{F: 0, T: timestamp.FromTime(start)},
+{F: 1, T: timestamp.FromTime(start.Add(1 * time.Second))},
+{F: 2, T: timestamp.FromTime(start.Add(2 * time.Second))},
},
// No Metric returned - use zero value for comparison.
},
@@ -3059,7 +3059,7 @@ func TestRespond(t *testing.T) {
ResultType: parser.ValueTypeMatrix,
Result: promql.Matrix{
promql.Series{
-Points: []promql.Point{{V: 1, T: 1000}},
+Floats: []promql.FPoint{{F: 1, T: 1000}},
Metric: labels.FromStrings("__name__", "foo"),
},
},
@@ -3071,7 +3071,7 @@ func TestRespond(t *testing.T) {
ResultType: parser.ValueTypeMatrix,
Result: promql.Matrix{
promql.Series{
-Points: []promql.Point{{H: &histogram.FloatHistogram{
+Histograms: []promql.HPoint{{H: &histogram.FloatHistogram{
Schema: 2,
ZeroThreshold: 0.001,
ZeroCount: 12,
@@ -3094,63 +3094,63 @@ func TestRespond(t *testing.T) {
expected: `{"status":"success","data":{"resultType":"matrix","result":[{"metric":{"__name__":"foo"},"histograms":[[1,{"count":"10","sum":"20","buckets":[[1,"-1.6817928305074288","-1.414213562373095","1"],[1,"-1.414213562373095","-1.189207115002721","2"],[3,"-0.001","0.001","12"],[0,"1.414213562373095","1.6817928305074288","1"],[0,"1.6817928305074288","2","2"],[0,"2.378414230005442","2.82842712474619","2"],[0,"2.82842712474619","3.3635856610148576","1"],[0,"3.3635856610148576","4","1"]]}]]}]}}`,
},
{
-response: promql.Point{V: 0, T: 0},
+response: promql.FPoint{F: 0, T: 0},
expected: `{"status":"success","data":[0,"0"]}`,
},
{
-response: promql.Point{V: 20, T: 1},
+response: promql.FPoint{F: 20, T: 1},
expected: `{"status":"success","data":[0.001,"20"]}`,
},
{
-response: promql.Point{V: 20, T: 10},
+response: promql.FPoint{F: 20, T: 10},
expected: `{"status":"success","data":[0.010,"20"]}`,
},
{
-response: promql.Point{V: 20, T: 100},
+response: promql.FPoint{F: 20, T: 100},
expected: `{"status":"success","data":[0.100,"20"]}`,
},
{
-response: promql.Point{V: 20, T: 1001},
+response: promql.FPoint{F: 20, T: 1001},
expected: `{"status":"success","data":[1.001,"20"]}`,
},
{
-response: promql.Point{V: 20, T: 1010},
+response: promql.FPoint{F: 20, T: 1010},
expected: `{"status":"success","data":[1.010,"20"]}`,
},
{
-response: promql.Point{V: 20, T: 1100},
+response: promql.FPoint{F: 20, T: 1100},
expected: `{"status":"success","data":[1.100,"20"]}`,
},
{
-response: promql.Point{V: 20, T: 12345678123456555},
+response: promql.FPoint{F: 20, T: 12345678123456555},
expected: `{"status":"success","data":[12345678123456.555,"20"]}`,
},
{
-response: promql.Point{V: 20, T: -1},
+response: promql.FPoint{F: 20, T: -1},
expected: `{"status":"success","data":[-0.001,"20"]}`,
},
{
-response: promql.Point{V: math.NaN(), T: 0},
+response: promql.FPoint{F: math.NaN(), T: 0},
expected: `{"status":"success","data":[0,"NaN"]}`,
},
{
-response: promql.Point{V: math.Inf(1), T: 0},
+response: promql.FPoint{F: math.Inf(1), T: 0},
expected: `{"status":"success","data":[0,"+Inf"]}`,
},
{
-response: promql.Point{V: math.Inf(-1), T: 0},
+response: promql.FPoint{F: math.Inf(-1), T: 0},
expected: `{"status":"success","data":[0,"-Inf"]}`,
},
{
-response: promql.Point{V: 1.2345678e6, T: 0},
+response: promql.FPoint{F: 1.2345678e6, T: 0},
expected: `{"status":"success","data":[0,"1234567.8"]}`,
},
{
-response: promql.Point{V: 1.2345678e-6, T: 0},
+response: promql.FPoint{F: 1.2345678e-6, T: 0},
expected: `{"status":"success","data":[0,"0.0000012345678"]}`,
},
{
-response: promql.Point{V: 1.2345678e-67, T: 0},
+response: promql.FPoint{F: 1.2345678e-67, T: 0},
expected: `{"status":"success","data":[0,"1.2345678e-67"]}`,
},
{
@@ -3283,15 +3283,15 @@ var testResponseWriter = httptest.ResponseRecorder{}
func BenchmarkRespond(b *testing.B) {
b.ReportAllocs()
-points := []promql.Point{}
+points := []promql.FPoint{}
for i := 0; i < 10000; i++ {
-points = append(points, promql.Point{V: float64(i * 1000000), T: int64(i)})
+points = append(points, promql.FPoint{F: float64(i * 1000000), T: int64(i)})
}
response := &queryData{
ResultType: parser.ValueTypeMatrix,
Result: promql.Matrix{
promql.Series{
-Points: points,
+Floats: points,
Metric: labels.EmptyLabels(),
},
},
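The timestamp expectations above (`T: 1100` rendering as `1.100`, and so on) come from `jsonutil.MarshalTimestamp`, which writes millisecond timestamps as seconds with millisecond precision. A quick illustrative check:

```go
package main

import (
	"fmt"

	jsoniter "github.com/json-iterator/go"

	"github.com/prometheus/prometheus/util/jsonutil"
)

func main() {
	api := jsoniter.ConfigCompatibleWithStandardLibrary
	stream := api.BorrowStream(nil)
	defer api.ReturnStream(stream)
	jsonutil.MarshalTimestamp(1100, stream) // Milliseconds in...
	fmt.Println(string(stream.Buffer()))    // ...seconds out: 1.100
}
```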

web/federate.go (6 lines changed)

@@ -145,7 +145,9 @@ Loop:
vec = append(vec, promql.Sample{
Metric: s.Labels(),
-Point: promql.Point{T: t, V: v, H: fh},
+T: t,
+F: v,
+H: fh,
})
}
if ws := set.Warnings(); len(ws) > 0 {
@@ -262,7 +264,7 @@ Loop:
if !isHistogram {
lastHistogramWasGauge = false
protMetric.Untyped = &dto.Untyped{
-Value: proto.Float64(s.V),
+Value: proto.Float64(s.F),
}
} else {
lastHistogramWasGauge = s.H.CounterResetHint == histogram.GaugeType
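Federation now fills the flat `Sample` fields directly: `F` for a float value, `H` for a native histogram, with `H == nil` as the discriminator (the same convention the `Untyped` branch above uses). A minimal illustrative consumer:

```go
package main

import (
	"fmt"

	"github.com/prometheus/prometheus/promql"
)

// printSample branches the way the federation code above does:
// a Sample carries a float when H is nil, a histogram otherwise.
func printSample(s promql.Sample) {
	if s.H == nil {
		fmt.Printf("float %g @ %d\n", s.F, s.T)
		return
	}
	fmt.Printf("histogram count=%g sum=%g @ %d\n", s.H.Count, s.H.Sum, s.T)
}

func main() {
	printSample(promql.Sample{T: 1000, F: 42})
}
```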

web/federate_test.go (29 lines changed)

@@ -342,14 +342,16 @@ func TestFederationWithNativeHistograms(t *testing.T) {
if i%3 == 0 {
_, err = app.Append(0, l, 100*60*1000, float64(i*100))
expVec = append(expVec, promql.Sample{
-Point: promql.Point{T: 100 * 60 * 1000, V: float64(i * 100)},
+T: 100 * 60 * 1000,
+F: float64(i * 100),
Metric: expL,
})
} else {
hist.ZeroCount++
_, err = app.AppendHistogram(0, l, 100*60*1000, hist.Copy(), nil)
expVec = append(expVec, promql.Sample{
-Point: promql.Point{T: 100 * 60 * 1000, H: hist.ToFloat()},
+T: 100 * 60 * 1000,
+H: hist.ToFloat(),
Metric: expL,
})
}
@@ -379,6 +381,7 @@ func TestFederationWithNativeHistograms(t *testing.T) {
p := textparse.NewProtobufParser(body)
var actVec promql.Vector
metricFamilies := 0
+l := labels.Labels{}
for {
et, err := p.Next()
if err == io.EOF {
@@ -389,23 +392,23 @@ func TestFederationWithNativeHistograms(t *testing.T) {
metricFamilies++
}
if et == textparse.EntryHistogram || et == textparse.EntrySeries {
-l := labels.Labels{}
p.Metric(&l)
-actVec = append(actVec, promql.Sample{Metric: l})
}
if et == textparse.EntryHistogram {
_, parsedTimestamp, h, fh := p.Histogram()
require.Nil(t, h)
-actVec[len(actVec)-1].Point = promql.Point{
-T: *parsedTimestamp,
-H: fh,
-}
+actVec = append(actVec, promql.Sample{
+T: *parsedTimestamp,
+H: fh,
+Metric: l,
+})
} else if et == textparse.EntrySeries {
-_, parsedTimestamp, v := p.Series()
-actVec[len(actVec)-1].Point = promql.Point{
-T: *parsedTimestamp,
-V: v,
-}
+_, parsedTimestamp, f := p.Series()
+actVec = append(actVec, promql.Sample{
+T: *parsedTimestamp,
+F: f,
+Metric: l,
+})
}
}
