feat: respect iox::column_type::field metadata when mapping query #200

Merged Dec 6, 2024 · 23 commits (changes shown from 6 commits)
6f214d9
feat: respect iox::column_type::field metadata when mapping query
NguyenHoangSon96 Nov 21, 2024
0c91353
chore: update CHANGELOG.md
NguyenHoangSon96 Nov 21, 2024
def624c
chore: remove unused file
NguyenHoangSon96 Nov 21, 2024
a3408c0
chore: lint CHANGELOG.md
NguyenHoangSon96 Nov 21, 2024
cf375aa
refactor: remove setTimestamp function
NguyenHoangSon96 Nov 22, 2024
541e234
chore: update CHANGELOG.md
NguyenHoangSon96 Nov 22, 2024
7d35ff0
chore: update CHANGELOG.md and pom.xml release version
NguyenHoangSon96 Nov 25, 2024
95a29cb
chore: add javadoc for getTimestampNano function
NguyenHoangSon96 Nov 25, 2024
088b36d
refactor: change to enhance for loop
NguyenHoangSon96 Nov 25, 2024
2696bc7
fix: build
NguyenHoangSon96 Nov 25, 2024
6c3d2fb
feat: move on if type cast is fail
NguyenHoangSon96 Nov 26, 2024
24bfd5e
fix: linter
NguyenHoangSon96 Nov 26, 2024
1699443
fix: linter
NguyenHoangSon96 Nov 26, 2024
0fdf8e0
fix: linter
NguyenHoangSon96 Nov 26, 2024
c4bb8ab
feat: add getMappedValue function
NguyenHoangSon96 Nov 28, 2024
a796a13
refactor: getMappedValue function
NguyenHoangSon96 Nov 29, 2024
daab545
refactor: getMappedValue function
NguyenHoangSon96 Nov 29, 2024
06e29ed
refactor: getMappedValue function
NguyenHoangSon96 Nov 29, 2024
16ea54f
fix: linter
NguyenHoangSon96 Nov 29, 2024
547c87e
fix: linter
NguyenHoangSon96 Nov 29, 2024
dbaaf99
refactor: change warning message
NguyenHoangSon96 Dec 6, 2024
79159a9
fix: build
NguyenHoangSon96 Dec 6, 2024
fcca2df
refactor: getArrayObjectFromVectorSchemaRoot function
NguyenHoangSon96 Dec 6, 2024
10 changes: 10 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,15 @@
## 0.10.0 [unreleased]

### Features

1. [#200](https://github.com/InfluxCommunity/influxdb3-java/pull/200): Respect iox::column_type::field metadata when
   mapping query results into values.
   - iox::column_type::field::integer => Long
   - iox::column_type::field::uinteger => Long
   - iox::column_type::field::float => Double
   - iox::column_type::field::string => String
   - iox::column_type::field::boolean => Boolean
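The mapping above can be expressed as a small lookup. This is a hedged sketch only — the class and helper name (`IoxFieldTypes.javaTypeFor`) are hypothetical, not part of the library; the client applies the equivalent casts internally while mapping rows:

```java
public class IoxFieldTypes {

    // Hypothetical helper: the Java type each iox field metadata value maps to.
    static Class<?> javaTypeFor(String metaType) {
        switch (metaType) {
            case "iox::column_type::field::integer":
            case "iox::column_type::field::uinteger":
                return Long.class;
            case "iox::column_type::field::float":
                return Double.class;
            case "iox::column_type::field::string":
                return String.class;
            case "iox::column_type::field::boolean":
                return Boolean.class;
            default:
                return Object.class; // unknown field types pass through unchanged
        }
    }

    public static void main(String[] args) {
        System.out.println(javaTypeFor("iox::column_type::field::float")); // class java.lang.Double
    }
}
```

Note that both signed and unsigned integer columns land on `Long`, so callers need only one cast for either.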

## 0.9.0 [2024-08-12]

### Features
@@ -23,6 +23,7 @@

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.math.BigInteger;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.Collections;
@@ -41,6 +42,7 @@
import io.netty.handler.codec.http.HttpMethod;
import org.apache.arrow.vector.FieldVector;
import org.apache.arrow.vector.VectorSchemaRoot;
import org.apache.arrow.vector.util.Text;

import com.influxdb.v3.client.InfluxDBApiException;
import com.influxdb.v3.client.InfluxDBClient;
@@ -183,16 +185,48 @@ public Stream<Object[]> query(@Nonnull final String query,
return queryData(query, parameters, options)
.flatMap(vector -> {
List<FieldVector> fieldVectors = vector.getFieldVectors();
return IntStream
.range(0, vector.getRowCount())
.mapToObj(rowNumber -> {

ArrayList<Object> row = new ArrayList<>();
for (FieldVector fieldVector : fieldVectors) {
row.add(fieldVector.getObject(rowNumber));
}
return row.toArray();
});
var fields = vector.getSchema().getFields();
return IntStream.range(0, vector.getRowCount())
.mapToObj(rowNumber -> {
ArrayList<Object> row = new ArrayList<>();
for (int i = 0; i < fieldVectors.size(); i++) {
var schema = fields.get(i);
var metaType = schema.getMetadata().get("iox::column::type");
String valueType = metaType != null ? metaType.split("::")[2] : null;

Object value = fieldVectors.get(i).getObject(rowNumber);
if ("field".equals(valueType)) {
switch (metaType) {
case "iox::column_type::field::integer":
case "iox::column_type::field::uinteger":
var intValue = (Long) value;
row.add(intValue);
break;
case "iox::column_type::field::float":
var doubleValue = (Double) value;
row.add(doubleValue);
break;
case "iox::column_type::field::string":
var textValue = (Text) value;
row.add(textValue.toString());
break;
case "iox::column_type::field::boolean":
var boolValue = (Boolean) value;
row.add(boolValue);
break;
default:
}
} else if ("timestamp".equals(valueType)
|| Objects.equals(schema.getName(), "time")) {
BigInteger time = NanosecondConverter.getTimestampNano(value, schema);
row.add(time);
} else {
row.add(value);
}
}

return row.toArray();
});
});
}
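The branch logic above keys off the third `::` segment of the `iox::column::type` metadata value. That parsing step can be sketched in isolation (the class and helper names here are hypothetical, for illustration only):

```java
public class MetaTypeDemo {

    // Mirrors the parsing in query(): the third "::" segment names the
    // column's role ("field", "tag", or "timestamp"); null metadata stays null.
    static String valueType(String metaType) {
        return metaType != null ? metaType.split("::")[2] : null;
    }

    public static void main(String[] args) {
        System.out.println(valueType("iox::column_type::field::integer")); // field
        System.out.println(valueType("iox::column_type::tag"));            // tag
        System.out.println(valueType("iox::column_type::timestamp"));      // timestamp
    }
}
```

Only `field` columns then get the finer-grained switch on the full metadata string; `tag` and `timestamp` columns are handled by their own branches.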

@@ -24,11 +24,18 @@
import java.math.BigDecimal;
import java.math.BigInteger;
import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneOffset;
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.TimeUnit;
import java.util.function.Function;
import javax.annotation.Nonnull;
import javax.annotation.Nullable;

import org.apache.arrow.vector.types.pojo.ArrowType;
import org.apache.arrow.vector.types.pojo.Field;

import com.influxdb.v3.client.write.WritePrecision;

import static java.util.function.Function.identity;
@@ -111,4 +118,49 @@ public static BigInteger convert(final Instant instant, final WritePrecision pre

return FROM_NANOS.get(precision).apply(nanos);
}

public static BigInteger getTimestampNano(@Nonnull final Object value, @Nonnull final Field schema) {
BigInteger result = null;

if (value instanceof Long) {
if (schema.getFieldType().getType() instanceof ArrowType.Timestamp) {
ArrowType.Timestamp type = (ArrowType.Timestamp) schema.getFieldType().getType();
TimeUnit timeUnit;
switch (type.getUnit()) {
case SECOND:
timeUnit = TimeUnit.SECONDS;
break;
case MILLISECOND:
timeUnit = TimeUnit.MILLISECONDS;
break;
case MICROSECOND:
timeUnit = TimeUnit.MICROSECONDS;
break;
default:
case NANOSECOND:
timeUnit = TimeUnit.NANOSECONDS;
break;
}
long nanoseconds = TimeUnit.NANOSECONDS.convert((Long) value, timeUnit);
Instant instant = Instant.ofEpochSecond(0, nanoseconds);
result = convertInstantToNano(instant, WritePrecision.NS);
} else {
Instant instant = Instant.ofEpochMilli((Long) value);
result = convertInstantToNano(instant, WritePrecision.NS);
}
} else if (value instanceof LocalDateTime) {
Instant instant = ((LocalDateTime) value).toInstant(ZoneOffset.UTC);
result = convertInstantToNano(instant, WritePrecision.NS);
}
return result;
}

private static BigInteger convertInstantToNano(final Instant instant, final WritePrecision precision) {
var writePrecision = WritePrecision.NS;
if (precision != null) {
writePrecision = precision;
}
BigInteger convertedTime = NanosecondConverter.convert(instant, writePrecision);
return NanosecondConverter.convertToNanos(convertedTime, writePrecision);
}
}
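The core of `getTimestampNano` is normalizing an epoch value in the column's declared Arrow unit up to nanoseconds via `TimeUnit`. That step can be sketched on its own (class and helper names hypothetical):

```java
import java.math.BigInteger;
import java.util.concurrent.TimeUnit;

public class TimestampNanoDemo {

    // Same normalization as getTimestampNano: scale an epoch value in the
    // column's declared unit up to nanoseconds.
    static BigInteger toNanos(long value, TimeUnit unit) {
        return BigInteger.valueOf(TimeUnit.NANOSECONDS.convert(value, unit));
    }

    public static void main(String[] args) {
        System.out.println(toNanos(1L, TimeUnit.SECONDS));          // 1000000000
        System.out.println(toNanos(1_500L, TimeUnit.MILLISECONDS)); // 1500000000
    }
}
```

Note that `TimeUnit.NANOSECONDS.convert` saturates at `Long.MAX_VALUE` on overflow, which bounds representable timestamps to roughly the year 2262.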
@@ -21,22 +21,18 @@
*/
package com.influxdb.v3.client.internal;

import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneOffset;
import java.util.List;
import java.util.Objects;
import java.util.concurrent.TimeUnit;
import javax.annotation.Nonnull;
import javax.annotation.concurrent.ThreadSafe;

import org.apache.arrow.vector.FieldVector;
import org.apache.arrow.vector.VectorSchemaRoot;
import org.apache.arrow.vector.types.pojo.ArrowType;
import org.apache.arrow.vector.types.pojo.Field;
import org.apache.arrow.vector.util.Text;

import com.influxdb.v3.client.PointValues;
import com.influxdb.v3.client.write.WritePrecision;


/**
@@ -82,7 +78,8 @@

if (metaType == null) {
if (Objects.equals(name, "time") && (value instanceof Long || value instanceof LocalDateTime)) {
setTimestamp(value, schema, p);
var timeNano = NanosecondConverter.getTimestampNano(value, schema);
p.setTimestamp(timeNano, WritePrecision.NS);
} else {
// just push as a field if the type is unknown
p.setField(name, value);
@@ -95,45 +92,42 @@
String valueType = parts[2];

if ("field".equals(valueType)) {
p.setField(name, value);
setFieldWithMetaType(p, name, value, metaType);
} else if ("tag".equals(valueType) && value instanceof String) {
p.setTag(name, (String) value);
} else if ("timestamp".equals(valueType)) {
setTimestamp(value, schema, p);
var timeNano = NanosecondConverter.getTimestampNano(value, schema);
p.setTimestamp(timeNano, WritePrecision.NS);
}
}
return p;
}

private void setTimestamp(@Nonnull final Object value,
@Nonnull final Field schema,
@Nonnull final PointValues pointValues) {
if (value instanceof Long) {
if (schema.getFieldType().getType() instanceof ArrowType.Timestamp) {
ArrowType.Timestamp type = (ArrowType.Timestamp) schema.getFieldType().getType();
TimeUnit timeUnit;
switch (type.getUnit()) {
case SECOND:
timeUnit = TimeUnit.SECONDS;
break;
case MILLISECOND:
timeUnit = TimeUnit.MILLISECONDS;
break;
case MICROSECOND:
timeUnit = TimeUnit.MICROSECONDS;
break;
default:
case NANOSECOND:
timeUnit = TimeUnit.NANOSECONDS;
break;
}
long nanoseconds = TimeUnit.NANOSECONDS.convert((Long) value, timeUnit);
pointValues.setTimestamp(Instant.ofEpochSecond(0, nanoseconds));
} else {
pointValues.setTimestamp(Instant.ofEpochMilli((Long) value));
}
} else if (value instanceof LocalDateTime) {
pointValues.setTimestamp(((LocalDateTime) value).toInstant(ZoneOffset.UTC));
private void setFieldWithMetaType(final PointValues p,
final String name,
final Object value,
final String metaType) {
if (value == null) {
return;
}

switch (metaType) {
case "iox::column_type::field::integer":
case "iox::column_type::field::uinteger":
p.setIntegerField(name, (Long) value);
break;
case "iox::column_type::field::float":
p.setFloatField(name, (Double) value);
break;
case "iox::column_type::field::string":
p.setStringField(name, (String) value);
break;
case "iox::column_type::field::boolean":
p.setBooleanField(name, (Boolean) value);
break;
default:
p.setField(name, value);

break;
}
}
}
57 changes: 57 additions & 0 deletions src/test/java/com/influxdb/v3/client/InfluxDBClientTest.java
@@ -21,11 +21,19 @@
*/
package com.influxdb.v3.client;

import java.math.BigInteger;
import java.time.Instant;
import java.util.Map;
import java.util.Properties;
import java.util.UUID;
import java.util.stream.Stream;

import org.assertj.core.api.Assertions;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.condition.EnabledIfEnvironmentVariable;

import com.influxdb.v3.client.write.WriteOptions;
import com.influxdb.v3.client.write.WritePrecision;

class InfluxDBClientTest {

@@ -116,4 +124,53 @@ public void unsupportedQueryParams() throws Exception {
+ "class com.influxdb.v3.client.internal.InfluxDBClientImpl");
}
}

@EnabledIfEnvironmentVariable(named = "TESTING_INFLUXDB_URL", matches = ".*")
@EnabledIfEnvironmentVariable(named = "TESTING_INFLUXDB_TOKEN", matches = ".*")
@EnabledIfEnvironmentVariable(named = "TESTING_INFLUXDB_DATABASE", matches = ".*")
@Test
public void testQuery() throws Exception {
try (InfluxDBClient client = InfluxDBClient.getInstance(
System.getenv("TESTING_INFLUXDB_URL"),
System.getenv("TESTING_INFLUXDB_TOKEN").toCharArray(),
System.getenv("TESTING_INFLUXDB_DATABASE"),
null)) {
String uuid = UUID.randomUUID().toString();
long timestamp = Instant.now().getEpochSecond();
String record = String.format(
"host10,tag=empty "
+ "name=\"intel\","
+ "mem_total=2048,"
+ "disk_free=100i,"
+ "temperature=100.86,"
+ "isActive=true,"
+ "testId=\"%s\" %d",
uuid,
timestamp
);
client.writeRecord(record, new WriteOptions(null, WritePrecision.S, null));

Map<String, Object> parameters = Map.of("testId", uuid);
String sql = "Select * from host10 where \"testId\"=$testId";
try (Stream<Object[]> stream = client.query(sql, parameters)) {
stream.findFirst()
.ifPresent(objects -> {
Assertions.assertThat(objects[0].getClass()).isEqualTo(Long.class);
Assertions.assertThat(objects[0]).isEqualTo(100L);

Assertions.assertThat(objects[1].getClass()).isEqualTo(Boolean.class);
Assertions.assertThat(objects[1]).isEqualTo(true);

Assertions.assertThat(objects[2].getClass()).isEqualTo(Double.class);
Assertions.assertThat(objects[2]).isEqualTo(2048.0);

Assertions.assertThat(objects[3].getClass()).isEqualTo(String.class);
Assertions.assertThat(objects[3]).isEqualTo("intel");

Assertions.assertThat(objects[7].getClass()).isEqualTo(BigInteger.class);
Assertions.assertThat(objects[7]).isEqualTo(BigInteger.valueOf(timestamp * 1_000_000_000));
});
}
}
}
}