[dynamodb] Dynamodb refactor (#9937)

* [dynamodb] Update to SDKv2 Enhanced Client

In addition, introduce a new, simpler table layout that uses a single
table for all items and a more efficient data encoding (saving some read capacity).

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Time To Live (TTL) support with new table schema

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Support QuantityType

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] suppress null warnings in tests

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Optimized query performance

Similar to https://github.com/openhab/openhab-addons/pull/8938,
avoid calling Item.getUnit() repeatedly when querying data.

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Support for Group items

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Update copyright to 2021

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Remove TODO comments and add javadoc

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] javadoc

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Readability improved in TableCreatingPutItem

Also documenting the full retry logic.

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] verify fixes

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Remove slf4j from explicit dependencies

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Remove jackson from pom.xml, add as feature dep

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] bnd.importpackage tuned

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] abort query() immediately if not configured to avoid NPE

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] less chatty diagnostics

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] xml formatting

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] corrected logger class

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] null checks

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] netty client configured

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] bnd not to filter out importpackage org.slf4j.impl

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] cfg bundle group id

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Remove usage of org.apache.commons

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Remove extra prints from test

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Reduce @SuppressWarnings with generics

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] README extra space removed

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] spotless

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Removed unnecessary logging

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] encapsulation

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] remove unnecessary NonNullByDefault({}) on constructor-injected field

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] null annotations

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] less verbose logging in tests

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Prefer Collections.emptyList over List.of()

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] less verbose call

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Visitor to return values (simplifies the code)

Fewer warnings suppressed

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] comments for remaining warning suppressions

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] README tuning, typo fixing

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Using less verbose syntax

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] simplified logging on errors

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Code review comments

Avoid the null checker while keeping the code more compact

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] Null safety

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] configuration label and description formatting

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] xml indentation with tabs

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] @Nullable 1-line annotation with class fields

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] No need to override credentials per request

The client has the credentials set at build time

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] set API timeouts no matter what

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] adding exception message

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] static logger

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] dependency

- comments clarifying the logic of properties
- adding netty to dep.noembedding to ensure it is not compiled in

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] ensure correct jackson and netty versions using dependencyMgt

Specifically for development and testing

See 051c764789 for further discussion of why this is needed.

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] avoid google collections

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] jackson-dataformat-cbor not jackson-cbor

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] also restrict netty-transport-native-epoll linux-x86_64 version

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] refer to dynamodb.cfg similarly to other bundles

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] bnd.importpackage to excl. reactivestreams and typesafe.netty

These are compiled-in dependencies, and thus we do not want to have them in
OSGi Import-Package.

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* Update bundles/org.openhab.persistence.dynamodb/src/main/resources/OH-INF/config/config.xml

Co-authored-by: Fabian Wolter <github@fabian-wolter.de>
Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* Update bundles/org.openhab.persistence.dynamodb/src/main/resources/OH-INF/config/config.xml

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

Co-authored-by: Fabian Wolter <github@fabian-wolter.de>

* [dynamodb] remove netty-codec-http2 as it is included in tp-netty

See https://github.com/openhab/openhab-core/pull/2257/

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] removed duplicate in bnd.importpackage

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

* [dynamodb] slf4j-api marked as provided to remove dep errors in runtime

Signed-off-by: Sami Salonen <ssalonen@gmail.com>

Co-authored-by: Fabian Wolter <github@fabian-wolter.de>
Authored by Sami Salonen, 2021-04-10 23:13:38 +03:00; committed by GitHub.
parent 08602c04b4
commit b675160486
58 changed files with 4407 additions and 1592 deletions


@@ -5,6 +5,7 @@
<feature name="openhab-persistence-dynamodb" description="DynamoDB Persistence" version="${project.version}">
<feature>openhab-runtime-base</feature>
<feature dependency="true">openhab.tp-jackson</feature>
<feature dependency="true">openhab.tp-netty</feature>
<bundle start-level="80">mvn:org.openhab.addons.bundles/org.openhab.persistence.dynamodb/${project.version}</bundle>
<configfile finalname="${openhab.conf}/services/dynamodb.cfg" override="false">mvn:org.openhab.addons.features.karaf/org.openhab.addons.features.karaf.openhab-addons-external/${project.version}/cfg/dynamodb</configfile>
</feature>


@@ -1,129 +0,0 @@
/**
* Copyright (c) 2010-2021 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.persistence.dynamodb.internal;
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;
import java.util.UUID;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;
import org.eclipse.jdt.annotation.NonNullByDefault;
import org.eclipse.jdt.annotation.Nullable;
import org.openhab.core.items.Item;
import org.openhab.core.persistence.PersistenceService;
import org.openhab.core.types.State;
import org.openhab.core.types.UnDefType;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* Abstract class for buffered persistence services
*
* @param <T> Type of the state as accepted by the AWS SDK.
*
* @author Sami Salonen - Initial contribution
* @author Kai Kreuzer - Migration to 3.x
*
*/
@NonNullByDefault
public abstract class AbstractBufferedPersistenceService<T> implements PersistenceService {
private static final long BUFFER_OFFER_TIMEOUT_MILLIS = 500;
private final Logger logger = LoggerFactory.getLogger(AbstractBufferedPersistenceService.class);
protected @Nullable BlockingQueue<T> buffer;
private boolean writeImmediately;
protected void resetWithBufferSize(int bufferSize) {
int capacity = Math.max(1, bufferSize);
buffer = new ArrayBlockingQueue<>(capacity, true);
writeImmediately = bufferSize == 0;
}
protected abstract T persistenceItemFromState(String name, State state, ZonedDateTime time);
protected abstract boolean isReadyToStore();
protected abstract void flushBufferedData();
@Override
public void store(Item item) {
store(item, null);
}
@Override
public void store(Item item, @Nullable String alias) {
long storeStart = System.currentTimeMillis();
String uuid = UUID.randomUUID().toString();
if (item.getState() instanceof UnDefType) {
logger.debug("Undefined item state received. Not storing item {}.", item.getName());
return;
}
if (!isReadyToStore()) {
return;
}
if (buffer == null) {
throw new IllegalStateException("Buffer not initialized with resetWithBufferSize. Bug?");
}
ZonedDateTime time = ZonedDateTime.ofInstant(Instant.ofEpochMilli(storeStart), ZoneId.systemDefault());
String realName = item.getName();
String name = (alias != null) ? alias : realName;
State state = item.getState();
T persistenceItem = persistenceItemFromState(name, state, time);
logger.trace("store() called with item {}, which was converted to {} [{}]", item, persistenceItem, uuid);
if (writeImmediately) {
logger.debug("Writing immediately item {} [{}]", realName, uuid);
// We want to write everything immediately
// Synchronous behavior to ensure buffer does not get full.
synchronized (this) {
boolean buffered = addToBuffer(persistenceItem);
assert buffered;
flushBufferedData();
}
} else {
long bufferStart = System.currentTimeMillis();
boolean buffered = addToBuffer(persistenceItem);
if (buffered) {
logger.debug("Buffered item {} in {} ms. Total time for store(): {} [{}]", realName,
System.currentTimeMillis() - bufferStart, System.currentTimeMillis() - storeStart, uuid);
} else {
logger.debug(
"Buffer is full. Writing buffered data immediately and trying again. Consider increasing bufferSize");
// Buffer is full, commit it immediately
flushBufferedData();
boolean buffered2 = addToBuffer(persistenceItem);
if (buffered2) {
logger.debug("Buffered item in {} ms (2nd try, flushed buffer in-between) [{}]",
System.currentTimeMillis() - bufferStart, uuid);
} else {
// The unlikely case happened -- buffer got full again immediately
logger.warn("Buffering failed for the second time -- Too small bufferSize? Discarding data [{}]",
uuid);
}
}
}
}
protected boolean addToBuffer(T persistenceItem) {
try {
return buffer != null && buffer.offer(persistenceItem, BUFFER_OFFER_TIMEOUT_MILLIS, TimeUnit.MILLISECONDS);
} catch (InterruptedException e) {
logger.warn("Interrupted when trying to buffer data! Dropping data");
return false;
}
}
}
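The buffering strategy above relies on `BlockingQueue.offer` with a timeout, so a full buffer degrades gracefully instead of blocking the caller forever. A minimal standalone sketch of that pattern (names like `BufferDemo` and `tryBuffer` are illustrative, not from the add-on):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;

public class BufferDemo {
    static final long OFFER_TIMEOUT_MILLIS = 500;

    static <T> boolean tryBuffer(BlockingQueue<T> buffer, T item) {
        try {
            // offer() with a timeout returns false when the queue stays full,
            // instead of blocking the storing thread indefinitely
            return buffer.offer(item, OFFER_TIMEOUT_MILLIS, TimeUnit.MILLISECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return false;
        }
    }

    public static void main(String[] args) {
        // Fair queue of capacity 1, mirroring resetWithBufferSize()
        BlockingQueue<String> buffer = new ArrayBlockingQueue<>(1, true);
        System.out.println(tryBuffer(buffer, "a")); // capacity available -> true
        System.out.println(tryBuffer(buffer, "b")); // full, times out -> false
    }
}
```

On a false return the service flushes the buffer and retries once, matching the two-attempt logic in `store()` above.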


@@ -13,6 +13,8 @@
package org.openhab.persistence.dynamodb.internal;
import java.math.BigDecimal;
import java.time.Duration;
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;
@@ -20,12 +22,18 @@ import java.time.format.DateTimeParseException;
import java.util.HashMap;
import java.util.Map;
import javax.measure.Quantity;
import javax.measure.Unit;
import org.eclipse.jdt.annotation.NonNullByDefault;
import org.eclipse.jdt.annotation.Nullable;
import org.openhab.core.items.Item;
import org.openhab.core.library.items.CallItem;
import org.openhab.core.library.items.ColorItem;
import org.openhab.core.library.items.ContactItem;
import org.openhab.core.library.items.DateTimeItem;
import org.openhab.core.library.items.DimmerItem;
import org.openhab.core.library.items.ImageItem;
import org.openhab.core.library.items.LocationItem;
import org.openhab.core.library.items.NumberItem;
import org.openhab.core.library.items.PlayerItem;
@@ -40,17 +48,23 @@ import org.openhab.core.library.types.OpenClosedType;
import org.openhab.core.library.types.PercentType;
import org.openhab.core.library.types.PlayPauseType;
import org.openhab.core.library.types.PointType;
import org.openhab.core.library.types.QuantityType;
import org.openhab.core.library.types.RewindFastforwardType;
import org.openhab.core.library.types.StringListType;
import org.openhab.core.library.types.StringType;
import org.openhab.core.library.types.UpDownType;
import org.openhab.core.persistence.HistoricItem;
import org.openhab.core.types.State;
import org.openhab.core.types.UnDefType;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBTypeConverter;
import software.amazon.awssdk.enhanced.dynamodb.AttributeConverter;
import software.amazon.awssdk.enhanced.dynamodb.AttributeValueType;
import software.amazon.awssdk.enhanced.dynamodb.EnhancedType;
import software.amazon.awssdk.enhanced.dynamodb.TableSchema;
import software.amazon.awssdk.enhanced.dynamodb.mapper.StaticAttributeTags;
import software.amazon.awssdk.enhanced.dynamodb.mapper.StaticTableSchema.Builder;
import software.amazon.awssdk.services.dynamodb.model.AttributeValue;
/**
* Base class for all DynamoDBItem. Represents openHAB Item serialized in a suitable format for the database
@@ -59,33 +73,71 @@ import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBTypeConverter;
*
* @author Sami Salonen - Initial contribution
*/
@NonNullByDefault
public abstract class AbstractDynamoDBItem<T> implements DynamoDBItem<T> {
private static final BigDecimal REWIND_BIGDECIMAL = new BigDecimal("-1");
private static final BigDecimal PAUSE_BIGDECIMAL = new BigDecimal("0");
private static final BigDecimal PLAY_BIGDECIMAL = new BigDecimal("1");
private static final BigDecimal FAST_FORWARD_BIGDECIMAL = new BigDecimal("2");
private static final ZoneId UTC = ZoneId.of("UTC");
public static final ZonedDateTimeStringConverter ZONED_DATE_TIME_CONVERTER_STRING = new ZonedDateTimeStringConverter();
public static final ZonedDateTimeMilliEpochConverter ZONED_DATE_TIME_CONVERTER_MILLIEPOCH = new ZonedDateTimeMilliEpochConverter();
public static final DateTimeFormatter DATEFORMATTER = DateTimeFormatter.ofPattern(DATE_FORMAT).withZone(UTC);
protected static final Class<@Nullable Long> NULLABLE_LONG = (Class<@Nullable Long>) Long.class;
private static final String UNDEFINED_PLACEHOLDER = "<org.openhab.core.types.UnDefType.UNDEF>";
private static final Map<Class<? extends Item>, Class<? extends DynamoDBItem<?>>> ITEM_CLASS_MAP = new HashMap<>();
static {
ITEM_CLASS_MAP.put(CallItem.class, DynamoDBStringItem.class);
ITEM_CLASS_MAP.put(ContactItem.class, DynamoDBBigDecimalItem.class);
ITEM_CLASS_MAP.put(DateTimeItem.class, DynamoDBStringItem.class);
ITEM_CLASS_MAP.put(LocationItem.class, DynamoDBStringItem.class);
ITEM_CLASS_MAP.put(NumberItem.class, DynamoDBBigDecimalItem.class);
ITEM_CLASS_MAP.put(RollershutterItem.class, DynamoDBBigDecimalItem.class);
ITEM_CLASS_MAP.put(StringItem.class, DynamoDBStringItem.class);
ITEM_CLASS_MAP.put(SwitchItem.class, DynamoDBBigDecimalItem.class);
ITEM_CLASS_MAP.put(DimmerItem.class, DynamoDBBigDecimalItem.class); // inherited from SwitchItem (!)
ITEM_CLASS_MAP.put(ColorItem.class, DynamoDBStringItem.class); // inherited from DimmerItem
ITEM_CLASS_MAP.put(PlayerItem.class, DynamoDBStringItem.class);
public static AttributeConverter<ZonedDateTime> getTimestampConverter(boolean legacy) {
return legacy ? ZONED_DATE_TIME_CONVERTER_STRING : ZONED_DATE_TIME_CONVERTER_MILLIEPOCH;
}
public static final Class<DynamoDBItem<?>> getDynamoItemClass(Class<? extends Item> itemClass)
throws NullPointerException {
@SuppressWarnings("unchecked")
Class<DynamoDBItem<?>> dtoclass = (Class<DynamoDBItem<?>>) ITEM_CLASS_MAP.get(itemClass);
protected static <C extends AbstractDynamoDBItem<?>> Builder<C> getBaseSchemaBuilder(Class<C> clz, boolean legacy) {
return TableSchema.builder(clz).addAttribute(String.class,
a -> a.name(legacy ? DynamoDBItem.ATTRIBUTE_NAME_ITEMNAME_LEGACY : DynamoDBItem.ATTRIBUTE_NAME_ITEMNAME)
.getter(AbstractDynamoDBItem::getName).setter(AbstractDynamoDBItem::setName)
.tags(StaticAttributeTags.primaryPartitionKey()))
.addAttribute(ZonedDateTime.class, a -> a
.name(legacy ? DynamoDBItem.ATTRIBUTE_NAME_TIMEUTC_LEGACY : DynamoDBItem.ATTRIBUTE_NAME_TIMEUTC)
.getter(AbstractDynamoDBItem::getTime).setter(AbstractDynamoDBItem::setTime)
.tags(StaticAttributeTags.primarySortKey()).attributeConverter(getTimestampConverter(legacy)));
}
private static final Map<Class<? extends Item>, Class<? extends DynamoDBItem<?>>> ITEM_CLASS_MAP_LEGACY = new HashMap<>();
static {
ITEM_CLASS_MAP_LEGACY.put(CallItem.class, DynamoDBStringItem.class);
ITEM_CLASS_MAP_LEGACY.put(ContactItem.class, DynamoDBBigDecimalItem.class);
ITEM_CLASS_MAP_LEGACY.put(DateTimeItem.class, DynamoDBStringItem.class);
ITEM_CLASS_MAP_LEGACY.put(LocationItem.class, DynamoDBStringItem.class);
ITEM_CLASS_MAP_LEGACY.put(NumberItem.class, DynamoDBBigDecimalItem.class);
ITEM_CLASS_MAP_LEGACY.put(RollershutterItem.class, DynamoDBBigDecimalItem.class);
ITEM_CLASS_MAP_LEGACY.put(StringItem.class, DynamoDBStringItem.class);
ITEM_CLASS_MAP_LEGACY.put(SwitchItem.class, DynamoDBBigDecimalItem.class);
ITEM_CLASS_MAP_LEGACY.put(DimmerItem.class, DynamoDBBigDecimalItem.class);
ITEM_CLASS_MAP_LEGACY.put(ColorItem.class, DynamoDBStringItem.class);
ITEM_CLASS_MAP_LEGACY.put(PlayerItem.class, DynamoDBStringItem.class);
}
private static final Map<Class<? extends Item>, Class<? extends DynamoDBItem<?>>> ITEM_CLASS_MAP_NEW = new HashMap<>();
static {
ITEM_CLASS_MAP_NEW.put(CallItem.class, DynamoDBStringItem.class);
ITEM_CLASS_MAP_NEW.put(ContactItem.class, DynamoDBBigDecimalItem.class);
ITEM_CLASS_MAP_NEW.put(DateTimeItem.class, DynamoDBStringItem.class);
ITEM_CLASS_MAP_NEW.put(LocationItem.class, DynamoDBStringItem.class);
ITEM_CLASS_MAP_NEW.put(NumberItem.class, DynamoDBBigDecimalItem.class);
ITEM_CLASS_MAP_NEW.put(RollershutterItem.class, DynamoDBBigDecimalItem.class);
ITEM_CLASS_MAP_NEW.put(StringItem.class, DynamoDBStringItem.class);
ITEM_CLASS_MAP_NEW.put(SwitchItem.class, DynamoDBBigDecimalItem.class);
ITEM_CLASS_MAP_NEW.put(DimmerItem.class, DynamoDBBigDecimalItem.class);
ITEM_CLASS_MAP_NEW.put(ColorItem.class, DynamoDBStringItem.class);
ITEM_CLASS_MAP_NEW.put(PlayerItem.class, DynamoDBBigDecimalItem.class); // Different from LEGACY
}
public static final Class<? extends DynamoDBItem<?>> getDynamoItemClass(Class<? extends Item> itemClass,
boolean legacy) throws NullPointerException {
Class<? extends DynamoDBItem<?>> dtoclass = (legacy ? ITEM_CLASS_MAP_LEGACY : ITEM_CLASS_MAP_NEW)
.get(itemClass);
if (dtoclass == null) {
throw new IllegalArgumentException(String.format("Unknown item class %s", itemClass));
}
@@ -101,123 +153,313 @@ public abstract class AbstractDynamoDBItem<T> implements DynamoDBItem<T> {
* @author Sami Salonen - Initial contribution
*
*/
public static final class ZonedDateTimeConverter implements DynamoDBTypeConverter<String, ZonedDateTime> {
public static final class ZonedDateTimeStringConverter implements AttributeConverter<ZonedDateTime> {
@Override
public String convert(ZonedDateTime time) {
return DATEFORMATTER.format(time.withZoneSameInstant(UTC));
public AttributeValue transformFrom(ZonedDateTime time) {
return AttributeValue.builder().s(toString(time)).build();
}
@Override
public ZonedDateTime unconvert(String serialized) {
public ZonedDateTime transformTo(@NonNullByDefault({}) AttributeValue serialized) {
return transformTo(serialized.s());
}
@Override
public EnhancedType<ZonedDateTime> type() {
return EnhancedType.<ZonedDateTime> of(ZonedDateTime.class);
}
@Override
public AttributeValueType attributeValueType() {
return AttributeValueType.S;
}
public String toString(ZonedDateTime time) {
return DATEFORMATTER.format(time.withZoneSameInstant(UTC));
}
public ZonedDateTime transformTo(String serialized) {
return ZonedDateTime.parse(serialized, DATEFORMATTER);
}
}
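The string converter above normalizes all timestamps to UTC before formatting, and parses them back assuming UTC. The actual `DATE_FORMAT` constant is defined elsewhere in the bundle; the sketch below assumes an ISO-like millisecond pattern purely for illustration:

```java
import java.time.ZoneId;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;

public class Iso8601Demo {
    // Assumed pattern -- the bundle's real DATE_FORMAT is not shown in this hunk
    static final DateTimeFormatter FORMATTER = DateTimeFormatter
            .ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSS").withZone(ZoneId.of("UTC"));

    static String encode(ZonedDateTime time) {
        // withZoneSameInstant keeps the instant but rewrites the zone to UTC
        return FORMATTER.format(time.withZoneSameInstant(ZoneId.of("UTC")));
    }

    static ZonedDateTime decode(String serialized) {
        // withZone(UTC) on the formatter supplies the zone missing from the text
        return ZonedDateTime.parse(serialized, FORMATTER);
    }

    public static void main(String[] args) {
        ZonedDateTime t = ZonedDateTime.parse("2021-01-01T12:00:00+02:00");
        String s = encode(t); // "2021-01-01T10:00:00.000"
        System.out.println(s);
        System.out.println(decode(s).toInstant().equals(t.toInstant()));
    }
}
```

The round trip preserves the instant; only the original zone information is lost, which is why `asHistoricItem` later converts back to the system default zone for display.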
private static final ZonedDateTimeConverter zonedDateTimeConverter = new ZonedDateTimeConverter();
/**
* Custom converter for serialization/deserialization of ZonedDateTime.
*
* Serialization: ZonedDateTime is first converted to UTC and then stored as milliepochs
*
* @author Sami Salonen - Initial contribution
*
*/
public static final class ZonedDateTimeMilliEpochConverter implements AttributeConverter<ZonedDateTime> {
@Override
public AttributeValue transformFrom(ZonedDateTime time) {
return AttributeValue.builder().n(toEpochMilliString(time)).build();
}
@Override
public ZonedDateTime transformTo(@NonNullByDefault({}) AttributeValue serialized) {
return transformTo(serialized.n());
}
@Override
public EnhancedType<ZonedDateTime> type() {
return EnhancedType.<ZonedDateTime> of(ZonedDateTime.class);
}
@Override
public AttributeValueType attributeValueType() {
return AttributeValueType.N;
}
public static String toEpochMilliString(ZonedDateTime time) {
return String.valueOf(time.toInstant().toEpochMilli());
}
public static BigDecimal toBigDecimal(ZonedDateTime time) {
return new BigDecimal(toEpochMilliString(time));
}
public ZonedDateTime transformTo(String serialized) {
return transformTo(Long.valueOf(serialized));
}
public ZonedDateTime transformTo(Long epochMillis) {
return Instant.ofEpochMilli(epochMillis).atZone(UTC);
}
}
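The milli-epoch converter is the core of the "more efficient data encoding" mentioned in the commit message: a DynamoDB number is cheaper to store and compare than the legacy formatted string. A self-contained sketch of the round trip (class name is illustrative):

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class EpochMilliDemo {
    static final ZoneId UTC = ZoneId.of("UTC");

    static String toEpochMilliString(ZonedDateTime time) {
        // Serialized as a DynamoDB number (attribute type N)
        return String.valueOf(time.toInstant().toEpochMilli());
    }

    static ZonedDateTime fromEpochMilliString(String serialized) {
        // Deserialized back into UTC; the original zone is not retained
        return Instant.ofEpochMilli(Long.parseLong(serialized)).atZone(UTC);
    }

    public static void main(String[] args) {
        ZonedDateTime t = ZonedDateTime.parse("2021-04-10T23:13:38+03:00");
        String stored = toEpochMilliString(t);
        System.out.println(stored);
        System.out.println(fromEpochMilliString(stored).toInstant().equals(t.toInstant()));
    }
}
```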
private final Logger logger = LoggerFactory.getLogger(AbstractDynamoDBItem.class);
protected String name;
protected T state;
protected @Nullable T state;
protected ZonedDateTime time;
private @Nullable Integer expireDays;
private @Nullable Long expiry;
public AbstractDynamoDBItem(String name, T state, ZonedDateTime time) {
public AbstractDynamoDBItem(String name, @Nullable T state, ZonedDateTime time, @Nullable Integer expireDays) {
this.name = name;
this.state = state;
this.time = time;
if (expireDays != null && expireDays <= 0) {
throw new IllegalArgumentException();
}
this.expireDays = expireDays;
this.expiry = expireDays == null ? null : time.toInstant().plus(Duration.ofDays(expireDays)).getEpochSecond();
}
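The constructor above derives the TTL attribute from `expireDays`: DynamoDB's Time To Live feature expects an epoch timestamp in seconds, so the item time is shifted by the configured number of days and converted. A standalone sketch of that computation (names are illustrative):

```java
import java.time.Duration;
import java.time.ZonedDateTime;

public class ExpiryDemo {
    static Long expiryEpochSeconds(ZonedDateTime time, Integer expireDays) {
        if (expireDays != null && expireDays <= 0) {
            // Mirrors the constructor's argument validation
            throw new IllegalArgumentException("expireDays must be positive");
        }
        // null expireDays -> no TTL attribute written, item never expires
        return expireDays == null ? null
                : time.toInstant().plus(Duration.ofDays(expireDays)).getEpochSecond();
    }

    public static void main(String[] args) {
        ZonedDateTime t = ZonedDateTime.parse("2021-01-01T00:00:00Z");
        // One day later: exactly 86400 seconds after the item timestamp
        System.out.println(expiryEpochSeconds(t, 1) - t.toEpochSecond()); // 86400
        System.out.println(expiryEpochSeconds(t, null)); // null
    }
}
```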
public static DynamoDBItem<?> fromState(String name, State state, ZonedDateTime time) {
if (state instanceof DecimalType && !(state instanceof HSBType)) {
// also covers PercentType which is inherited from DecimalType
return new DynamoDBBigDecimalItem(name, ((DecimalType) state).toBigDecimal(), time);
} else if (state instanceof OnOffType) {
return new DynamoDBBigDecimalItem(name,
((OnOffType) state) == OnOffType.ON ? BigDecimal.ONE : BigDecimal.ZERO, time);
} else if (state instanceof OpenClosedType) {
return new DynamoDBBigDecimalItem(name,
((OpenClosedType) state) == OpenClosedType.OPEN ? BigDecimal.ONE : BigDecimal.ZERO, time);
} else if (state instanceof UpDownType) {
return new DynamoDBBigDecimalItem(name,
((UpDownType) state) == UpDownType.UP ? BigDecimal.ONE : BigDecimal.ZERO, time);
} else if (state instanceof DateTimeType) {
return new DynamoDBStringItem(name,
zonedDateTimeConverter.convert(((DateTimeType) state).getZonedDateTime()), time);
} else if (state instanceof UnDefType) {
return new DynamoDBStringItem(name, UNDEFINED_PLACEHOLDER, time);
} else if (state instanceof StringListType) {
return new DynamoDBStringItem(name, state.toFullString(), time);
/**
* Convert given state to target state.
*
* If conversion fails, IllegalStateException is raised.
* Use this method when you do not expect conversion to fail.
*
* @param <T> state type to convert to
* @param state state to convert
* @param clz class of the resulting state
* @return state as type T
* @throws IllegalStateException on failing conversion
*/
private static <T extends State> T convert(State state, Class<T> clz) {
@Nullable
T converted = state.as(clz);
if (converted == null) {
throw new IllegalStateException(String.format("Could not convert %s '%s' into %s",
state.getClass().getSimpleName(), state, clz.getSimpleName()));
}
return converted;
}
public static DynamoDBItem<?> fromStateLegacy(Item item, ZonedDateTime time) {
String name = item.getName();
State state = item.getState();
if (item instanceof PlayerItem) {
return new DynamoDBStringItem(name, state.toFullString(), time, null);
} else {
// HSBType, PointType, PlayPauseType and StringType
return new DynamoDBStringItem(name, state.toFullString(), time);
// Apart from PlayerItem, values are serialized to DynamoDB numbers/strings in the same way as in the
// legacy schema, so delegate to fromStateNew
return fromStateNew(item, time, null);
}
}
public static DynamoDBItem<?> fromStateNew(Item item, ZonedDateTime time, @Nullable Integer expireDays) {
String name = item.getName();
State state = item.getState();
if (item instanceof CallItem) {
return new DynamoDBStringItem(name, convert(state, StringListType.class).toFullString(), time, expireDays);
} else if (item instanceof ContactItem) {
return new DynamoDBBigDecimalItem(name, convert(state, DecimalType.class).toBigDecimal(), time, expireDays);
} else if (item instanceof DateTimeItem) {
return new DynamoDBStringItem(name,
ZONED_DATE_TIME_CONVERTER_STRING.toString(((DateTimeType) state).getZonedDateTime()), time,
expireDays);
} else if (item instanceof ImageItem) {
throw new IllegalArgumentException("Unsupported item " + item.getClass().getSimpleName());
} else if (item instanceof LocationItem) {
return new DynamoDBStringItem(name, state.toFullString(), time, expireDays);
} else if (item instanceof NumberItem) {
return new DynamoDBBigDecimalItem(name, convert(state, DecimalType.class).toBigDecimal(), time, expireDays);
} else if (item instanceof PlayerItem) {
if (state instanceof PlayPauseType) {
switch ((PlayPauseType) state) {
case PLAY:
return new DynamoDBBigDecimalItem(name, PLAY_BIGDECIMAL, time, expireDays);
case PAUSE:
return new DynamoDBBigDecimalItem(name, PAUSE_BIGDECIMAL, time, expireDays);
default:
throw new IllegalArgumentException("Unexpected enum with PlayPauseType: " + state.toString());
}
} else if (state instanceof RewindFastforwardType) {
switch ((RewindFastforwardType) state) {
case FASTFORWARD:
return new DynamoDBBigDecimalItem(name, FAST_FORWARD_BIGDECIMAL, time, expireDays);
case REWIND:
return new DynamoDBBigDecimalItem(name, REWIND_BIGDECIMAL, time, expireDays);
default:
throw new IllegalArgumentException(
"Unexpected enum with RewindFastforwardType: " + state.toString());
}
} else {
throw new IllegalStateException(
String.format("Unexpected state type %s with PlayerItem", state.getClass().getSimpleName()));
}
} else if (item instanceof RollershutterItem) {
// Normalize UP/DOWN to %
return new DynamoDBBigDecimalItem(name, convert(state, PercentType.class).toBigDecimal(), time, expireDays);
} else if (item instanceof StringItem) {
if (state instanceof StringType) {
return new DynamoDBStringItem(name, ((StringType) state).toString(), time, expireDays);
} else if (state instanceof DateTimeType) {
return new DynamoDBStringItem(name,
ZONED_DATE_TIME_CONVERTER_STRING.toString(((DateTimeType) state).getZonedDateTime()), time,
expireDays);
} else {
throw new IllegalStateException(
String.format("Unexpected state type %s with StringItem", state.getClass().getSimpleName()));
}
} else if (item instanceof ColorItem) { // Note: needs to be before parent class DimmerItem
return new DynamoDBStringItem(name, convert(state, HSBType.class).toFullString(), time, expireDays);
} else if (item instanceof DimmerItem) {// Note: needs to be before parent class SwitchItem
// Normalize ON/OFF to %
return new DynamoDBBigDecimalItem(name, convert(state, PercentType.class).toBigDecimal(), time, expireDays);
} else if (item instanceof SwitchItem) {
// Normalize ON/OFF to 1/0
return new DynamoDBBigDecimalItem(name, convert(state, DecimalType.class).toBigDecimal(), time, expireDays);
} else {
throw new IllegalArgumentException("Unsupported item " + item.getClass().getSimpleName());
}
}
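Several of the branches above normalize enum-like states into plain numbers (ON/OFF and OPEN/CLOSED become 1/0), and the read path later maps any non-zero value back by comparing against zero with `compareTo`, which ignores `BigDecimal` scale. A minimal sketch of that codec (class name is illustrative):

```java
import java.math.BigDecimal;

public class OnOffCodec {
    static BigDecimal encode(boolean on) {
        // SwitchItem ON/OFF and ContactItem OPEN/CLOSED are stored as 1/0
        return on ? BigDecimal.ONE : BigDecimal.ZERO;
    }

    static boolean decode(BigDecimal stored) {
        // compareTo, not equals: "1" and "1.0" both decode to ON
        return stored.compareTo(BigDecimal.ZERO) != 0;
    }

    public static void main(String[] args) {
        System.out.println(decode(encode(true)));          // true
        System.out.println(decode(new BigDecimal("1.0"))); // true, scale ignored
        System.out.println(decode(BigDecimal.ZERO));       // false
    }
}
```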
@Override
public HistoricItem asHistoricItem(final Item item) {
final State[] state = new State[1];
accept(new DynamoDBItemVisitor() {
public @Nullable HistoricItem asHistoricItem(final Item item) {
return asHistoricItem(item, null);
}
@Override
public void visit(DynamoDBStringItem dynamoStringItem) {
if (item instanceof ColorItem) {
state[0] = new HSBType(dynamoStringItem.getState());
} else if (item instanceof LocationItem) {
state[0] = new PointType(dynamoStringItem.getState());
} else if (item instanceof PlayerItem) {
String value = dynamoStringItem.getState();
try {
state[0] = PlayPauseType.valueOf(value);
} catch (IllegalArgumentException e) {
state[0] = RewindFastforwardType.valueOf(value);
}
} else if (item instanceof DateTimeItem) {
try {
// Parse ZonedDateTime from string. DATEFORMATTER assumes UTC in case it is not clear
// from the string (should be).
// We convert to default/local timezone for user convenience (e.g. display)
state[0] = new DateTimeType(zonedDateTimeConverter.unconvert(dynamoStringItem.getState())
.withZoneSameInstant(ZoneId.systemDefault()));
} catch (DateTimeParseException e) {
logger.warn("Failed to parse {} as date. Outputting UNDEF instead",
dynamoStringItem.getState());
state[0] = UnDefType.UNDEF;
}
} else if (dynamoStringItem.getState().equals(UNDEFINED_PLACEHOLDER)) {
state[0] = UnDefType.UNDEF;
} else if (item instanceof CallItem) {
String parts = dynamoStringItem.getState();
String[] strings = parts.split(",");
String orig = strings[0];
String dest = strings[1];
state[0] = new StringListType(orig, dest);
} else {
state[0] = new StringType(dynamoStringItem.getState());
}
}
@Override
public @Nullable HistoricItem asHistoricItem(final Item item, @Nullable Unit<?> targetUnit) {
final State deserializedState;
if (this.getState() == null) {
return null;
}
try {
deserializedState = accept(new DynamoDBItemVisitor<@Nullable State>() {
@Override
public void visit(DynamoDBBigDecimalItem dynamoBigDecimalItem) {
if (item instanceof NumberItem) {
state[0] = new DecimalType(dynamoBigDecimalItem.getState());
} else if (item instanceof DimmerItem) {
state[0] = new PercentType(dynamoBigDecimalItem.getState());
} else if (item instanceof SwitchItem) {
state[0] = dynamoBigDecimalItem.getState().compareTo(BigDecimal.ONE) == 0 ? OnOffType.ON
: OnOffType.OFF;
} else if (item instanceof ContactItem) {
state[0] = dynamoBigDecimalItem.getState().compareTo(BigDecimal.ONE) == 0 ? OpenClosedType.OPEN
: OpenClosedType.CLOSED;
} else if (item instanceof RollershutterItem) {
state[0] = new PercentType(dynamoBigDecimalItem.getState());
} else {
logger.warn("Not sure how to convert big decimal item {} to type {}. Using StringType as fallback",
dynamoBigDecimalItem.getName(), item.getClass());
state[0] = new StringType(dynamoBigDecimalItem.getState().toString());
@Override
public @Nullable State visit(DynamoDBStringItem dynamoStringItem) {
String stringState = dynamoStringItem.getState();
if (stringState == null) {
return null;
}
if (item instanceof ColorItem) {
return new HSBType(stringState);
} else if (item instanceof LocationItem) {
return new PointType(stringState);
} else if (item instanceof PlayerItem) {
// Backwards-compatibility with legacy schema. New schema uses DynamoDBBigDecimalItem
try {
return PlayPauseType.valueOf(stringState);
} catch (IllegalArgumentException e) {
return RewindFastforwardType.valueOf(stringState);
}
} else if (item instanceof DateTimeItem) {
try {
// Parse ZonedDateTime from string. DATEFORMATTER assumes UTC in case it is not clear
// from the string (it should be).
// We convert to default/local timezone for user convenience (e.g. display)
return new DateTimeType(ZONED_DATE_TIME_CONVERTER_STRING.transformTo(stringState)
.withZoneSameInstant(ZoneId.systemDefault()));
} catch (DateTimeParseException e) {
logger.warn("Failed to parse {} as date. Outputting UNDEF instead", stringState);
return UnDefType.UNDEF;
}
} else if (item instanceof CallItem) {
String parts = stringState;
String[] strings = parts.split(",");
String orig = strings[0];
String dest = strings[1];
return new StringListType(orig, dest);
} else {
return new StringType(dynamoStringItem.getState());
}
}
@Override
public @Nullable State visit(DynamoDBBigDecimalItem dynamoBigDecimalItem) {
BigDecimal numberState = dynamoBigDecimalItem.getState();
if (numberState == null) {
return null;
}
if (item instanceof NumberItem) {
NumberItem numberItem = ((NumberItem) item);
Unit<? extends Quantity<?>> unit = targetUnit == null ? numberItem.getUnit() : targetUnit;
if (unit != null) {
return new QuantityType<>(numberState, unit);
} else {
return new DecimalType(numberState);
}
} else if (item instanceof DimmerItem) {
// % values have been stored as-is
return new PercentType(numberState);
} else if (item instanceof SwitchItem) {
return numberState.compareTo(BigDecimal.ZERO) != 0 ? OnOffType.ON : OnOffType.OFF;
} else if (item instanceof ContactItem) {
return numberState.compareTo(BigDecimal.ZERO) != 0 ? OpenClosedType.OPEN
: OpenClosedType.CLOSED;
} else if (item instanceof RollershutterItem) {
// Percents and UP/DOWN have been stored as % values (not fractional)
return new PercentType(numberState);
} else if (item instanceof PlayerItem) {
if (numberState.equals(PLAY_BIGDECIMAL)) {
return PlayPauseType.PLAY;
} else if (numberState.equals(PAUSE_BIGDECIMAL)) {
return PlayPauseType.PAUSE;
} else if (numberState.equals(FAST_FORWARD_BIGDECIMAL)) {
return RewindFastforwardType.FASTFORWARD;
} else if (numberState.equals(REWIND_BIGDECIMAL)) {
return RewindFastforwardType.REWIND;
} else {
throw new IllegalArgumentException("Unknown serialized value");
}
} else {
logger.warn(
"Not sure how to convert big decimal item {} to type {}. Using StringType as fallback",
dynamoBigDecimalItem.getName(), item.getClass());
return new StringType(numberState.toString());
}
}
});
if (deserializedState == null) {
return null;
}
return new DynamoDBHistoricItem(getName(), deserializedState, getTime());
} catch (Exception e) {
logger.trace("Failed to convert state '{}' to item {} {}: {} {}. Data persisted with incompatible item.",
this.state, item.getClass().getSimpleName(), item.getName(), e.getClass().getSimpleName(),
e.getMessage());
return null;
}
}
@@ -232,10 +474,53 @@ public abstract class AbstractDynamoDBItem<T> implements DynamoDBItem<T> {
/*
* @see DynamoDBItem#accept(DynamoDBItemVisitor)
*/
@Override
public abstract <R> R accept(DynamoDBItemVisitor<R> visitor);
@Override
public String toString() {
@Nullable
T localState = state;
return DATEFORMATTER.format(time) + ": " + name + " -> "
+ (localState == null ? "<null>" : localState.toString());
}
@Override
public String getName() {
return name;
}
@Override
public void setName(String name) {
this.name = name;
}
@Override
public ZonedDateTime getTime() {
return time;
}
@Override
@Nullable
public Long getExpiryDate() {
return expiry;
}
@Override
public void setTime(ZonedDateTime time) {
this.time = time;
}
@Override
public @Nullable Integer getExpireDays() {
return expireDays;
}
@Override
public void setExpireDays(@Nullable Integer expireDays) {
this.expireDays = expireDays;
}
public void setExpiry(@Nullable Long expiry) {
this.expiry = expiry;
}
}
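The `getExpiryDate()` accessor above returns the DynamoDB TTL attribute as epoch seconds, derived from the item timestamp plus `expireDays`. A minimal, self-contained sketch of that computation; the helper name `expiryEpochSeconds` is illustrative and not part of the addon's API:

```java
import java.time.ZonedDateTime;

public class ExpirySketch {
    // Epoch seconds of time + expireDays, or null when TTL is disabled,
    // mirroring the getExpiryDate() contract above.
    static Long expiryEpochSeconds(ZonedDateTime time, Integer expireDays) {
        if (expireDays == null) {
            return null; // expiration disabled
        }
        return time.plusDays(expireDays).toEpochSecond();
    }

    public static void main(String[] args) {
        ZonedDateTime t = ZonedDateTime.parse("2021-01-01T00:00:00Z");
        System.out.println(expiryEpochSeconds(t, 7)); // epoch seconds 7 days after t
        System.out.println(expiryEpochSeconds(t, null)); // null
    }
}
```

DynamoDB's TTL feature expects exactly this shape: a number attribute holding an expiry timestamp in epoch seconds.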


@@ -16,20 +16,37 @@ import java.math.BigDecimal;
import java.math.MathContext;
import java.time.ZonedDateTime;
import org.eclipse.jdt.annotation.NonNullByDefault;
import org.eclipse.jdt.annotation.Nullable;
import software.amazon.awssdk.enhanced.dynamodb.mapper.StaticTableSchema;
/**
* DynamoDBItem for items that can be serialized as DynamoDB number
*
* @author Sami Salonen - Initial contribution
*/
@NonNullByDefault
public class DynamoDBBigDecimalItem extends AbstractDynamoDBItem<BigDecimal> {
private static Class<@Nullable BigDecimal> NULLABLE_BIGDECIMAL = (Class<@Nullable BigDecimal>) BigDecimal.class;
public static StaticTableSchema<DynamoDBBigDecimalItem> TABLE_SCHEMA_LEGACY = getBaseSchemaBuilder(
DynamoDBBigDecimalItem.class, true).newItemSupplier(DynamoDBBigDecimalItem::new)
.addAttribute(NULLABLE_BIGDECIMAL, a -> a.name(ATTRIBUTE_NAME_ITEMSTATE_LEGACY)
.getter(DynamoDBBigDecimalItem::getState).setter(DynamoDBBigDecimalItem::setState))
.build();
public static StaticTableSchema<DynamoDBBigDecimalItem> TABLE_SCHEMA_NEW = getBaseSchemaBuilder(
DynamoDBBigDecimalItem.class, false)
.newItemSupplier(DynamoDBBigDecimalItem::new)
.addAttribute(NULLABLE_BIGDECIMAL,
a -> a.name(ATTRIBUTE_NAME_ITEMSTATE_NUMBER).getter(DynamoDBBigDecimalItem::getState)
.setter(DynamoDBBigDecimalItem::setState))
.addAttribute(NULLABLE_LONG, a -> a.name(ATTRIBUTE_NAME_EXPIRY)
.getter(AbstractDynamoDBItem::getExpiryDate).setter(AbstractDynamoDBItem::setExpiry))
.build();
/**
* We get the following error if the BigDecimal has too many digits
* "Attempting to store more than 38 significant digits in a Number"
*/
@@ -40,58 +57,36 @@ public class DynamoDBBigDecimalItem extends AbstractDynamoDBItem<BigDecimal> {
private static final int MAX_DIGITS_SUPPORTED_BY_AMAZON = 38;
public DynamoDBBigDecimalItem() {
this("", null, ZonedDateTime.now(), null);
}
public DynamoDBBigDecimalItem(String name, @Nullable BigDecimal state, ZonedDateTime time,
@Nullable Integer expireDays) {
super(name, state, time, expireDays);
}
@Override
public @Nullable BigDecimal getState() {
// When serializing this to the wire, we round the number in order to ensure
// that it is within the dynamodb limits
BigDecimal localState = state;
if (localState == null) {
return null;
}
return loseDigits(localState);
}
@Override
public void setState(@Nullable BigDecimal state) {
this.state = state;
}
@Override
public <T> T accept(DynamoDBItemVisitor<T> visitor) {
return visitor.visit(this);
}
static BigDecimal loseDigits(BigDecimal number) {
return number.round(new MathContext(MAX_DIGITS_SUPPORTED_BY_AMAZON));
}
}
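`loseDigits()` guards against DynamoDB's limit of 38 significant digits for number attributes by rounding with a `MathContext`. The same rounding, runnable standalone with only `java.math`:

```java
import java.math.BigDecimal;
import java.math.MathContext;

public class LoseDigitsSketch {
    // DynamoDB rejects numbers with more than 38 significant digits
    // ("Attempting to store more than 38 significant digits in a Number"),
    // so anything longer is rounded before serialization.
    static final int MAX_DIGITS_SUPPORTED_BY_AMAZON = 38;

    static BigDecimal loseDigits(BigDecimal number) {
        return number.round(new MathContext(MAX_DIGITS_SUPPORTED_BY_AMAZON));
    }

    public static void main(String[] args) {
        BigDecimal tooPrecise = new BigDecimal("1.2345678901234567890123456789012345678901");
        // After rounding, the precision fits within the DynamoDB limit
        System.out.println(loseDigits(tooPrecise).precision());
    }
}
```

Rounding (rather than rejecting) trades a tiny loss of precision for never failing a write on an over-precise state value.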


@@ -1,66 +0,0 @@
/**
* Copyright (c) 2010-2021 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.persistence.dynamodb.internal;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
import com.amazonaws.services.dynamodbv2.document.DynamoDB;
/**
* Shallow wrapper around the AWS DynamoDB client classes
*
* @author Sami Salonen - Initial contribution
*/
public class DynamoDBClient {
private final Logger logger = LoggerFactory.getLogger(DynamoDBClient.class);
private DynamoDB dynamo;
private AmazonDynamoDB client;
public DynamoDBClient(AWSCredentials credentials, Regions region) {
client = AmazonDynamoDBClientBuilder.standard().withRegion(region)
.withCredentials(new AWSStaticCredentialsProvider(credentials)).build();
dynamo = new DynamoDB(client);
}
public DynamoDBClient(DynamoDBConfig clientConfig) {
this(clientConfig.getCredentials(), clientConfig.getRegion());
}
public AmazonDynamoDB getDynamoClient() {
return client;
}
public DynamoDB getDynamoDB() {
return dynamo;
}
public void shutdown() {
dynamo.shutdown();
}
public boolean checkConnection() {
try {
dynamo.listTables(1).firstPage();
} catch (Exception e) {
logger.warn("Got internal server error when trying to list tables: {}", e.getMessage());
return false;
}
return true;
}
}


@@ -12,8 +12,9 @@
*/
package org.openhab.persistence.dynamodb.internal;
import java.nio.file.Path;
import java.util.Map;
import java.util.Optional;
import java.util.stream.Collectors;
import org.eclipse.jdt.annotation.NonNullByDefault;
@@ -21,35 +22,45 @@ import org.eclipse.jdt.annotation.Nullable;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
import software.amazon.awssdk.auth.credentials.AwsCredentials;
import software.amazon.awssdk.auth.credentials.ProfileCredentialsProvider;
import software.amazon.awssdk.awscore.retry.AwsRetryPolicy;
import software.amazon.awssdk.core.retry.RetryMode;
import software.amazon.awssdk.core.retry.RetryPolicy;
import software.amazon.awssdk.profiles.ProfileFile;
import software.amazon.awssdk.profiles.ProfileFile.Type;
import software.amazon.awssdk.profiles.ProfileProperty;
import software.amazon.awssdk.regions.Region;
/**
* Configuration for DynamoDB connections
*
* If the table parameter is specified and not blank, we use the new table schema (ExpectedTableSchema.NEW).
* If the tablePrefix parameter is specified and not blank, we use the legacy table schema (ExpectedTableSchema.LEGACY).
* In other cases we conservatively assume ExpectedTableSchema.MAYBE_LEGACY and detect the right schema at runtime.
*
* @author Sami Salonen - Initial contribution
*/
@NonNullByDefault
public class DynamoDBConfig {
public static final String DEFAULT_TABLE_PREFIX = "openhab-";
public static final String DEFAULT_TABLE_NAME = "openhab";
public static final long DEFAULT_READ_CAPACITY_UNITS = 1;
public static final long DEFAULT_WRITE_CAPACITY_UNITS = 1;
public static final RetryMode DEFAULT_RETRY_MODE = RetryMode.STANDARD;
private static final Logger LOGGER = LoggerFactory.getLogger(DynamoDBConfig.class);
private long readCapacityUnits;
private long writeCapacityUnits;
private Region region;
private AwsCredentials credentials;
private RetryPolicy retryPolicy;
private ExpectedTableSchema tableRevision;
private String table;
private String tablePrefixLegacy;
private @Nullable Integer expireDays;
/**
*
@@ -57,26 +68,26 @@ public class DynamoDBConfig {
* @return DynamoDB configuration. Returns null in case of configuration errors
*/
public static @Nullable DynamoDBConfig fromConfig(Map<String, Object> config) {
ExpectedTableSchema tableRevision;
try {
String regionName = (String) config.get("region");
if (regionName == null) {
return null;
}
final Region region;
if (Region.regions().stream().noneMatch(r -> r.toString().equals(regionName))) {
LOGGER.warn("Region {} is not matching known regions: {}. The region might not be supported.",
regionName, Region.regions().stream().map(r -> r.toString()).collect(Collectors.joining(", ")));
}
region = Region.of(regionName);
RetryMode retryMode = RetryMode.STANDARD;
AwsCredentials credentials;
String accessKey = (String) config.get("accessKey");
String secretKey = (String) config.get("secretKey");
if (accessKey != null && !accessKey.isBlank() && secretKey != null && !secretKey.isBlank()) {
LOGGER.debug("accessKey and secretKey specified. Using those.");
credentials = AwsBasicCredentials.create(accessKey, secretKey);
} else {
LOGGER.debug("accessKey and/or secretKey blank. Checking profilesConfigFile and profile.");
String profilesConfigFile = (String) config.get("profilesConfigFile");
@@ -87,28 +98,49 @@ public class DynamoDBConfig {
+ "profile for providing AWS credentials");
return null;
}
ProfileFile profileFile = ProfileFile.builder().content(Path.of(profilesConfigFile))
.type(Type.CREDENTIALS).build();
credentials = ProfileCredentialsProvider.builder().profileFile(profileFile).profileName(profile).build()
.resolveCredentials();
retryMode = profileFile.profile(profile).flatMap(p -> p.property(ProfileProperty.RETRY_MODE))
.flatMap(retry_mode -> {
for (RetryMode value : RetryMode.values()) {
if (retry_mode.equalsIgnoreCase(value.name())) {
return Optional.of(value);
}
}
LOGGER.warn("Unknown retry_mode '{}' in profile. Ignoring and using default {} retry mode.",
retry_mode, DEFAULT_RETRY_MODE);
return Optional.empty();
}).orElse(DEFAULT_RETRY_MODE);
LOGGER.debug("Retry mode {}", retryMode);
}
String table = (String) config.get("table");
String tablePrefixLegacy;
if (table == null || table.isBlank()) {
// the new parameter 'table' has not been set. Check whether the legacy parameter 'tablePrefix' is set
table = DEFAULT_TABLE_NAME;
tablePrefixLegacy = (String) config.get("tablePrefix");
if (tablePrefixLegacy == null || tablePrefixLegacy.isBlank()) {
LOGGER.debug("Using default table prefix {}", DEFAULT_TABLE_PREFIX);
// No explicit value has been specified for tablePrefix, user could be still using the legacy setup
tableRevision = ExpectedTableSchema.MAYBE_LEGACY;
tablePrefixLegacy = DEFAULT_TABLE_PREFIX;
} else {
// Explicit value for tablePrefix, user certainly prefers LEGACY
tableRevision = ExpectedTableSchema.LEGACY;
}
} else {
tableRevision = ExpectedTableSchema.NEW;
tablePrefixLegacy = DEFAULT_TABLE_PREFIX;
}
final long readCapacityUnits;
String readCapacityUnitsParam = (String) config.get("readCapacityUnits");
if (readCapacityUnitsParam == null || readCapacityUnitsParam.isBlank()) {
LOGGER.debug("Read capacity units: {}", DEFAULT_READ_CAPACITY_UNITS);
readCapacityUnits = DEFAULT_READ_CAPACITY_UNITS;
} else {
readCapacityUnits = Long.parseLong(readCapacityUnitsParam);
@@ -117,66 +149,100 @@ public class DynamoDBConfig {
final long writeCapacityUnits;
String writeCapacityUnitsParam = (String) config.get("writeCapacityUnits");
if (writeCapacityUnitsParam == null || writeCapacityUnitsParam.isBlank()) {
LOGGER.debug("Write capacity units: {}", DEFAULT_WRITE_CAPACITY_UNITS);
writeCapacityUnits = DEFAULT_WRITE_CAPACITY_UNITS;
} else {
writeCapacityUnits = Long.parseLong(writeCapacityUnitsParam);
}
final @Nullable Integer expireDays;
String expireDaysString = (String) config.get("expireDays");
if (expireDaysString == null || expireDaysString.isBlank()) {
expireDays = null;
} else {
expireDays = Integer.parseInt(expireDaysString);
if (expireDays <= 0) {
LOGGER.error("expireDays should be positive integer or null");
return null;
}
}
switch (tableRevision) {
case NEW:
LOGGER.debug("Using new DynamoDB table schema");
return DynamoDBConfig.newSchema(region, credentials, AwsRetryPolicy.forRetryMode(retryMode), table,
readCapacityUnits, writeCapacityUnits, expireDays);
case LEGACY:
LOGGER.warn(
"Using legacy DynamoDB table schema. It is recommended to transition to new schema by defining 'table' parameter and not configuring 'tablePrefix'");
return DynamoDBConfig.legacySchema(region, credentials, AwsRetryPolicy.forRetryMode(retryMode),
tablePrefixLegacy, readCapacityUnits, writeCapacityUnits);
case MAYBE_LEGACY:
LOGGER.debug(
"Unclear whether we should use the new or the legacy DynamoDB table schema. It is recommended to explicitly define the new 'table' parameter. The correct table schema will be detected at runtime.");
return DynamoDBConfig.maybeLegacySchema(region, credentials, AwsRetryPolicy.forRetryMode(retryMode),
table, tablePrefixLegacy, readCapacityUnits, writeCapacityUnits, expireDays);
default:
throw new IllegalStateException("Unhandled enum. Bug");
}
} catch (Exception e) {
LOGGER.error("Error with configuration: {} {}", e.getClass().getSimpleName(), e.getMessage());
return null;
}
}
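`fromConfig()` above resolves which table schema to expect from the `table` and legacy `tablePrefix` parameters. The precedence can be sketched standalone; the enum and method names below are simplified stand-ins for the addon's `ExpectedTableSchema` handling, not its exact code:

```java
import java.util.Map;

public class TableSchemaSelectionSketch {
    // Hypothetical mirror of the addon's ExpectedTableSchema enum.
    enum ExpectedTableSchema { NEW, LEGACY, MAYBE_LEGACY }

    // 'table' set -> NEW schema; only legacy 'tablePrefix' set -> LEGACY;
    // neither set -> MAYBE_LEGACY, resolved at runtime by probing the tables.
    static ExpectedTableSchema resolve(Map<String, ?> config) {
        String table = (String) config.get("table");
        if (table != null && !table.isBlank()) {
            return ExpectedTableSchema.NEW;
        }
        String tablePrefix = (String) config.get("tablePrefix");
        if (tablePrefix != null && !tablePrefix.isBlank()) {
            return ExpectedTableSchema.LEGACY;
        }
        return ExpectedTableSchema.MAYBE_LEGACY;
    }

    public static void main(String[] args) {
        System.out.println(resolve(Map.of("table", "openhab"))); // NEW
        System.out.println(resolve(Map.of("tablePrefix", "openhab-"))); // LEGACY
        System.out.println(resolve(Map.of())); // MAYBE_LEGACY
    }
}
```

Defaulting to MAYBE_LEGACY keeps existing installations working without any configuration change, at the cost of a runtime schema probe.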
private static DynamoDBConfig newSchema(Region region, AwsCredentials credentials, RetryPolicy retryPolicy,
String table, long readCapacityUnits, long writeCapacityUnits, @Nullable Integer expireDays) {
return new DynamoDBConfig(region, credentials, retryPolicy, table, "", ExpectedTableSchema.NEW,
readCapacityUnits, writeCapacityUnits, expireDays);
}
private static DynamoDBConfig legacySchema(Region region, AwsCredentials credentials, RetryPolicy retryPolicy,
String tablePrefixLegacy, long readCapacityUnits, long writeCapacityUnits) {
return new DynamoDBConfig(region, credentials, retryPolicy, "", tablePrefixLegacy, ExpectedTableSchema.LEGACY,
readCapacityUnits, writeCapacityUnits, null);
}
private static DynamoDBConfig maybeLegacySchema(Region region, AwsCredentials credentials, RetryPolicy retryPolicy,
String table, String tablePrefixLegacy, long readCapacityUnits, long writeCapacityUnits,
@Nullable Integer expireDays) {
return new DynamoDBConfig(region, credentials, retryPolicy, table, tablePrefixLegacy,
ExpectedTableSchema.MAYBE_LEGACY, readCapacityUnits, writeCapacityUnits, expireDays);
}
private DynamoDBConfig(Region region, AwsCredentials credentials, RetryPolicy retryPolicy, String table,
String tablePrefixLegacy, ExpectedTableSchema tableRevision, long readCapacityUnits,
long writeCapacityUnits, @Nullable Integer expireDays) {
this.region = region;
this.credentials = credentials;
this.retryPolicy = retryPolicy;
this.table = table;
this.tablePrefixLegacy = tablePrefixLegacy;
this.tableRevision = tableRevision;
this.readCapacityUnits = readCapacityUnits;
this.writeCapacityUnits = writeCapacityUnits;
this.expireDays = expireDays;
}
public AwsCredentials getCredentials() {
return credentials;
}
public String getTablePrefixLegacy() {
return tablePrefixLegacy;
}
public String getTable() {
return table;
}
public ExpectedTableSchema getTableRevision() {
return tableRevision;
}
public Region getRegion() {
return region;
}
public long getReadCapacityUnits() {
return readCapacityUnits;
}
@@ -185,11 +251,11 @@ public class DynamoDBConfig {
return writeCapacityUnits;
}
public RetryPolicy getRetryPolicy() {
return retryPolicy;
}
public @Nullable Integer getExpireDays() {
return expireDays;
}
}
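The `expireDays` parameter parsed in `fromConfig()` treats a missing or blank value as "TTL disabled" and rejects non-positive values. A standalone sketch of that validation; note the real code logs an error and returns null from `fromConfig()` rather than throwing, and the class and method names here are illustrative:

```java
import java.util.Map;

public class ExpireDaysConfigSketch {
    // Blank/missing expireDays disables TTL (null); non-positive values are invalid.
    static Integer parseExpireDays(Map<String, ?> config) {
        String raw = (String) config.get("expireDays");
        if (raw == null || raw.isBlank()) {
            return null; // TTL disabled
        }
        int days = Integer.parseInt(raw);
        if (days <= 0) {
            throw new IllegalArgumentException("expireDays should be a positive integer or null");
        }
        return days;
    }

    public static void main(String[] args) {
        System.out.println(parseExpireDays(Map.of("expireDays", "30"))); // 30
        System.out.println(parseExpireDays(Map.of())); // null
    }
}
```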


@@ -14,6 +14,10 @@ package org.openhab.persistence.dynamodb.internal;
import java.time.ZonedDateTime;
import javax.measure.Unit;
import org.eclipse.jdt.annotation.NonNullByDefault;
import org.eclipse.jdt.annotation.Nullable;
import org.openhab.core.items.Item;
import org.openhab.core.persistence.HistoricItem;
@@ -24,35 +28,127 @@ import org.openhab.core.persistence.HistoricItem;
*
* @author Sami Salonen - Initial contribution
*/
@NonNullByDefault
public interface DynamoDBItem<T> {
static final String DATE_FORMAT = "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'";
static final String ATTRIBUTE_NAME_TIMEUTC_LEGACY = "timeutc";
static final String ATTRIBUTE_NAME_ITEMNAME_LEGACY = "itemname";
static final String ATTRIBUTE_NAME_ITEMSTATE_LEGACY = "itemstate";
static final String ATTRIBUTE_NAME_TIMEUTC = "t";
static final String ATTRIBUTE_NAME_ITEMNAME = "i";
static final String ATTRIBUTE_NAME_ITEMSTATE_STRING = "s";
static final String ATTRIBUTE_NAME_ITEMSTATE_NUMBER = "n";
static final String ATTRIBUTE_NAME_EXPIRY = "exp";
/**
* Convert this AbstractDynamoItem to a HistoricItem, i.e. convert the serialized state back to an openHAB state.
*
* Returns null when this instance has null state.
*
* If item is a NumberItem and has a unit, the data is converted to QuantityType with item.getUnit().
*
* @param item Item representing this item. Used to determine item type.
* @return HistoricItem representing this DynamoDBItem.
*/
@Nullable
HistoricItem asHistoricItem(Item item);
/**
* Convert this AbstractDynamoItem to a HistoricItem.
*
* Returns null when this instance has null state.
* The implementation can deal with legacy schema as well.
*
* Use this method when repeated calls are expected for the same item (avoids the expensive call to item.getUnit()).
*
* @param item Item representing this item. Used to determine item type.
* @param targetUnit unit to convert the data if item is with Dimension. Has only effect with NumberItems and with
* numeric DynamoDBItems.
* @return HistoricItem representing this DynamoDBItem.
*/
@Nullable
HistoricItem asHistoricItem(Item item, @Nullable Unit<?> targetUnit);
/**
* Get item name
*
* @return item name
*/
String getName();
/**
* Get item state, in the serialized format
*
* @return item state as serialized format
*/
@Nullable
T getState();
/**
* Get timestamp of this value
*
* @return timestamp
*/
ZonedDateTime getTime();
/**
* Get expire time for the DynamoDB item in days.
*
* Does not have any effect with legacy schema.
*
* Also known as time-to-live or TTL.
* Null means that expiration is disabled.
*
* @return expire time in days
*/
@Nullable
Integer getExpireDays();
/**
* Get expiry date for the DynamoDB item in epoch seconds
*
* This is used with DynamoDB Time to Live TTL feature.
*
* @return expiry date of the data. Equivalent to getTime() + getExpireDays() or null when expireDays is null.
*/
@Nullable
Long getExpiryDate();
/**
* Setter for item name
*
* @param name item name
*/
void setName(String name);
/**
* Setter for serialized state
*
* @param state serialized state
*/
void setState(@Nullable T state);
/**
* Set timestamp of the data
*
* @param time timestamp
*/
void setTime(ZonedDateTime time);
/**
* Set expire time for the DynamoDB item in days.
*
* Does not have any effect with legacy schema.
*
* Also known as time-to-live or TTL.
* Use null to disable expiration
*
* @param expireDays expire time in days. Should be positive or null.
*
*/
void setExpireDays(@Nullable Integer expireDays);
<R> R accept(DynamoDBItemVisitor<R> visitor);
}
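The legacy DATE_FORMAT above quotes the trailing 'Z' as a literal, so parsed strings carry no zone information; the converter interprets them as UTC and `asHistoricItem()` then shifts to the system default zone for display. A minimal sketch of that convention (the `parseAsUtc` helper name is illustrative):

```java
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.ZoneOffset;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;

public class DateFormatSketch {
    // The 'Z' is a quoted literal, not a zone designator, so the pattern
    // yields a LocalDateTime; the stored value is assumed to be UTC.
    static final DateTimeFormatter DATEFORMATTER = DateTimeFormatter
            .ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'");

    static ZonedDateTime parseAsUtc(String serialized) {
        return LocalDateTime.parse(serialized, DATEFORMATTER).atZone(ZoneOffset.UTC);
    }

    public static void main(String[] args) {
        ZonedDateTime utc = parseAsUtc("2021-05-01T12:00:00.000Z");
        // Convert to the local zone for user-facing display, as asHistoricItem() does
        System.out.println(utc.withZoneSameInstant(ZoneId.systemDefault()));
    }
}
```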


@@ -21,9 +21,9 @@ import org.eclipse.jdt.annotation.NonNullByDefault;
*
*/
@NonNullByDefault
public interface DynamoDBItemVisitor<T> {
public T visit(DynamoDBBigDecimalItem dynamoBigDecimalItem);
public T visit(DynamoDBStringItem dynamoStringItem);
}
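The visitor change above replaces void, side-effecting `visit` methods with a generic return value, which removes the one-element `state[0]` holder array used in the old implementation. A self-contained model of the same refactor; all class and method names here are simplified stand-ins, not the addon's types:

```java
public class VisitorSketch {
    // Simplified stand-ins for the two serialized item flavours.
    static class NumberItem { java.math.BigDecimal state = java.math.BigDecimal.ONE; }
    static class StringItem { String state = "ON"; }

    // Generic visitor in the style of DynamoDBItemVisitor<T>: each visit
    // returns a value instead of mutating shared state.
    interface Visitor<T> {
        T visit(NumberItem item);
        T visit(StringItem item);
    }

    static final Visitor<String> STRINGIFY = new Visitor<String>() {
        public String visit(NumberItem item) { return "number:" + item.state; }
        public String visit(StringItem item) { return "string:" + item.state; }
    };

    static String describe(Object item) {
        return item instanceof NumberItem ? STRINGIFY.visit((NumberItem) item)
                : STRINGIFY.visit((StringItem) item);
    }

    public static void main(String[] args) {
        System.out.println(describe(new NumberItem())); // number:1
        System.out.println(describe(new StringItem())); // string:ON
    }
}
```

Returning a value also lets the caller declare the result `@Nullable` in one place instead of auditing every mutation site.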


@@ -12,30 +12,37 @@
*/
package org.openhab.persistence.dynamodb.internal;
import java.lang.reflect.InvocationTargetException;
import java.net.URI;
import java.time.Duration;
import java.time.Instant;
import java.time.ZonedDateTime;
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Collections;
import java.util.Deque;
import java.util.HashMap;
import java.util.Iterator;
import java.util.List;
import java.util.Locale;
import java.util.Map;
import java.util.Map.Entry;
import java.util.Set;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.stream.Collectors;
import javax.measure.Unit;
import org.eclipse.jdt.annotation.NonNullByDefault;
import org.eclipse.jdt.annotation.Nullable;
import org.openhab.core.common.NamedThreadFactory;
import org.openhab.core.common.ThreadPoolManager;
import org.openhab.core.config.core.ConfigurableService;
import org.openhab.core.items.GenericItem;
import org.openhab.core.items.GroupItem;
import org.openhab.core.items.Item;
import org.openhab.core.items.ItemNotFoundException;
import org.openhab.core.items.ItemRegistry;
import org.openhab.core.library.items.NumberItem;
import org.openhab.core.library.types.QuantityType;
import org.openhab.core.persistence.FilterCriteria;
import org.openhab.core.persistence.HistoricItem;
import org.openhab.core.persistence.PersistenceItemInfo;
@@ -43,31 +50,31 @@ import org.openhab.core.persistence.PersistenceService;
import org.openhab.core.persistence.QueryablePersistenceService;
import org.openhab.core.persistence.strategy.PersistenceStrategy;
import org.openhab.core.types.State;
import org.openhab.core.types.UnDefType;
import org.osgi.framework.BundleContext;
import org.osgi.framework.Constants;
import org.osgi.service.component.annotations.Activate;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Deactivate;
import org.osgi.service.component.annotations.Reference;
import org.reactivestreams.Subscriber;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
import software.amazon.awssdk.awscore.AwsRequestOverrideConfiguration;
import software.amazon.awssdk.core.async.SdkPublisher;
import software.amazon.awssdk.core.client.config.ClientAsyncConfiguration;
import software.amazon.awssdk.core.client.config.ClientOverrideConfiguration;
import software.amazon.awssdk.core.client.config.SdkAdvancedAsyncClientOption;
import software.amazon.awssdk.enhanced.dynamodb.DynamoDbAsyncTable;
import software.amazon.awssdk.enhanced.dynamodb.DynamoDbEnhancedAsyncClient;
import software.amazon.awssdk.enhanced.dynamodb.TableSchema;
import software.amazon.awssdk.enhanced.dynamodb.model.QueryEnhancedRequest;
import software.amazon.awssdk.http.nio.netty.NettyNioAsyncHttpClient;
import software.amazon.awssdk.services.dynamodb.DynamoDbAsyncClient;
import software.amazon.awssdk.services.dynamodb.DynamoDbAsyncClientBuilder;
import software.amazon.awssdk.services.dynamodb.model.ResourceNotFoundException;
/**
* This is the implementation of the DynamoDB {@link PersistenceService}. It persists item values
@@ -87,92 +94,40 @@ import com.amazonaws.services.dynamodbv2.model.WriteRequest;
QueryablePersistenceService.class }, configurationPid = "org.openhab.dynamodb", //
property = Constants.SERVICE_PID + "=org.openhab.dynamodb")
@ConfigurableService(category = "persistence", label = "DynamoDB Persistence Service", description_uri = DynamoDBPersistenceService.CONFIG_URI)
public class DynamoDBPersistenceService extends AbstractBufferedPersistenceService<DynamoDBItem<?>>
implements QueryablePersistenceService {
public class DynamoDBPersistenceService implements QueryablePersistenceService {
private static final int MAX_CONCURRENCY = 100;
protected static final String CONFIG_URI = "persistence:dynamodb";
private class ExponentialBackoffRetry implements Runnable {
private int retry;
private Map<String, List<WriteRequest>> unprocessedItems;
private @Nullable Exception lastException;
public ExponentialBackoffRetry(Map<String, List<WriteRequest>> unprocessedItems) {
this.unprocessedItems = unprocessedItems;
}
@Override
public void run() {
logger.debug("Error storing object to dynamo, unprocessed items: {}. Retrying with exponential back-off",
unprocessedItems);
lastException = null;
while (!unprocessedItems.isEmpty() && retry < WAIT_MILLIS_IN_RETRIES.length) {
if (!sleep()) {
// Interrupted
return;
}
retry++;
try {
BatchWriteItemOutcome outcome = DynamoDBPersistenceService.this.db.getDynamoDB()
.batchWriteItemUnprocessed(unprocessedItems);
unprocessedItems = outcome.getUnprocessedItems();
lastException = null;
} catch (AmazonServiceException e) {
if (e instanceof ResourceNotFoundException) {
logger.debug(
"DynamoDB query raised unexpected exception: {}. This might happen if table was recently created",
e.getMessage());
} else {
logger.debug("DynamoDB query raised unexpected exception: {}.", e.getMessage());
}
lastException = e;
continue;
}
}
if (unprocessedItems.isEmpty()) {
logger.debug("After {} retries successfully wrote all unprocessed items", retry);
} else {
logger.warn(
"Even after retries failed to write some items. Last exception: {} {}, unprocessed items: {}",
lastException == null ? "null" : lastException.getClass().getName(),
lastException == null ? "null" : lastException.getMessage(), unprocessedItems);
}
}
private boolean sleep() {
try {
long sleepTime;
if (retry == 1 && lastException != null && lastException instanceof ResourceNotFoundException) {
sleepTime = WAIT_ON_FIRST_RESOURCE_NOT_FOUND_MILLIS;
} else {
sleepTime = WAIT_MILLIS_IN_RETRIES[retry];
}
Thread.sleep(sleepTime);
return true;
} catch (InterruptedException e) {
logger.debug("Interrupted while writing data!");
return false;
}
}
public Map<String, List<WriteRequest>> getUnprocessedItems() {
return unprocessedItems;
}
}
private static final int WAIT_ON_FIRST_RESOURCE_NOT_FOUND_MILLIS = 5000;
private static final int[] WAIT_MILLIS_IN_RETRIES = new int[] { 100, 100, 200, 300, 500 };
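The retry loop above sleeps according to a fixed back-off table, with a longer first wait when the previous failure was a ResourceNotFoundException (the table may still be in the process of being created). A standalone sketch of that delay selection (class and method names below are illustrative, not part of the addon):

```java
// Illustrative sketch of the delay schedule used by ExponentialBackoffRetry.
public class BackoffScheduleSketch {
    static final long WAIT_ON_FIRST_RESOURCE_NOT_FOUND_MILLIS = 5000;
    static final long[] WAIT_MILLIS_IN_RETRIES = { 100, 100, 200, 300, 500 };

    // retry counts completed attempts; the special case applies on the sleep
    // before the second attempt, when the first attempt hit a missing table.
    static long delayMillis(int retry, boolean lastWasResourceNotFound) {
        if (retry == 1 && lastWasResourceNotFound) {
            return WAIT_ON_FIRST_RESOURCE_NOT_FOUND_MILLIS;
        }
        return WAIT_MILLIS_IN_RETRIES[retry];
    }
}
```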
private static final String DYNAMODB_THREADPOOL_NAME = "dynamodbPersistenceService";
private final ItemRegistry itemRegistry;
private @Nullable DynamoDBClient db;
private final Logger logger = LoggerFactory.getLogger(DynamoDBPersistenceService.class);
private ItemRegistry itemRegistry;
private @Nullable DynamoDbEnhancedAsyncClient client;
private @Nullable DynamoDbAsyncClient lowLevelClient;
private final static Logger logger = LoggerFactory.getLogger(DynamoDBPersistenceService.class);
private boolean isProperlyConfigured;
private @NonNullByDefault({}) DynamoDBConfig dbConfig;
private @NonNullByDefault({}) DynamoDBTableNameResolver tableNameResolver;
private final ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1,
new NamedThreadFactory(DYNAMODB_THREADPOOL_NAME));
private @Nullable ScheduledFuture<?> writeBufferedDataFuture;
private @Nullable DynamoDBConfig dbConfig;
private @Nullable DynamoDBTableNameResolver tableNameResolver;
private final ExecutorService executor = ThreadPoolManager.getPool(DYNAMODB_THREADPOOL_NAME);
private static final Duration TIMEOUT_API_CALL = Duration.ofSeconds(60);
private static final Duration TIMEOUT_API_CALL_ATTEMPT = Duration.ofSeconds(5);
private Map<Class<? extends DynamoDBItem<?>>, DynamoDbAsyncTable<? extends DynamoDBItem<?>>> tableCache = new ConcurrentHashMap<>(
2);
private @Nullable URI endpointOverride;
void overrideConfig(AwsRequestOverrideConfiguration.Builder config) {
config.apiCallAttemptTimeout(TIMEOUT_API_CALL_ATTEMPT).apiCallTimeout(TIMEOUT_API_CALL);
}
void overrideConfig(ClientOverrideConfiguration.Builder config) {
DynamoDBConfig localDbConfig = dbConfig;
config.apiCallAttemptTimeout(TIMEOUT_API_CALL_ATTEMPT).apiCallTimeout(TIMEOUT_API_CALL);
if (localDbConfig != null) {
config.retryPolicy(localDbConfig.getRetryPolicy());
}
}
@Activate
public DynamoDBPersistenceService(final @Reference ItemRegistry itemRegistry) {
@@ -180,26 +135,51 @@ public class DynamoDBPersistenceService extends AbstractBufferedPersistenceServi
}
/**
* For testing. Allows access to underlying DynamoDBClient.
*
* @return DynamoDBClient connected to AWS DynamoDB.
* For tests
*/
DynamoDBPersistenceService(final ItemRegistry itemRegistry, @Nullable URI endpointOverride) {
this.itemRegistry = itemRegistry;
this.endpointOverride = endpointOverride;
}
/**
* For tests
*/
@Nullable
DynamoDBClient getDb() {
return db;
URI getEndpointOverride() {
return endpointOverride;
}
@Nullable
DynamoDbAsyncClient getLowLevelClient() {
return lowLevelClient;
}
ExecutorService getExecutor() {
return executor;
}
@Nullable
DynamoDBTableNameResolver getTableNameResolver() {
return tableNameResolver;
}
@Nullable
DynamoDBConfig getDbConfig() {
return dbConfig;
}
@Activate
public void activate(final @Nullable BundleContext bundleContext, final Map<String, Object> config) {
resetClient();
dbConfig = DynamoDBConfig.fromConfig(config);
if (dbConfig == null) {
disconnect();
DynamoDBConfig localDbConfig = dbConfig = DynamoDBConfig.fromConfig(config);
if (localDbConfig == null) {
// Configuration was invalid. Abort service activation.
// Error is already logged in fromConfig.
return;
}
tableNameResolver = new DynamoDBTableNameResolver(dbConfig.getTablePrefix());
tableNameResolver = new DynamoDBTableNameResolver(localDbConfig.getTableRevision(), localDbConfig.getTable(),
localDbConfig.getTablePrefixLegacy());
try {
if (!ensureClient()) {
logger.error("Error creating dynamodb database client. Aborting service activation.");
@@ -210,27 +190,6 @@ public class DynamoDBPersistenceService extends AbstractBufferedPersistenceServi
return;
}
writeBufferedDataFuture = null;
resetWithBufferSize(dbConfig.getBufferSize());
long commitIntervalMillis = dbConfig.getBufferCommitIntervalMillis();
if (commitIntervalMillis > 0) {
writeBufferedDataFuture = scheduler.scheduleWithFixedDelay(new Runnable() {
@Override
public void run() {
try {
DynamoDBPersistenceService.this.flushBufferedData();
} catch (RuntimeException e) {
// We want to catch all unexpected exceptions since all unhandled exceptions make
// ScheduledExecutorService halt the regular running of the task.
// It is better to print out the exception, and try again
// (on next cycle)
logger.warn(
"Execution of scheduled flushing of buffered data failed unexpectedly. Ignoring exception, trying again according to configured commit interval of {} ms.",
commitIntervalMillis, e);
}
}
}, 0, commitIntervalMillis, TimeUnit.MILLISECONDS);
}
isProperlyConfigured = true;
logger.debug("dynamodb persistence service activated");
}
@@ -238,24 +197,45 @@ public class DynamoDBPersistenceService extends AbstractBufferedPersistenceServi
@Deactivate
public void deactivate() {
logger.debug("dynamodb persistence service deactivated");
if (writeBufferedDataFuture != null) {
writeBufferedDataFuture.cancel(false);
writeBufferedDataFuture = null;
}
resetClient();
logIfManyQueuedTasks();
disconnect();
}
/**
* Initializes DynamoDBClient (db field)
* Initializes Dynamo DB client and determines schema
*
* If DynamoDBClient constructor throws an exception, error is logged and false is returned.
* If construction fails, error is logged and false is returned.
*
* @return whether initialization was successful.
*/
private boolean ensureClient() {
if (db == null) {
DynamoDBConfig localDbConfig = dbConfig;
if (localDbConfig == null) {
return false;
}
if (client == null) {
try {
db = new DynamoDBClient(dbConfig);
synchronized (this) {
if (this.client != null) {
return true;
}
DynamoDbAsyncClientBuilder lowlevelClientBuilder = DynamoDbAsyncClient.builder()
.credentialsProvider(StaticCredentialsProvider.create(localDbConfig.getCredentials()))
.httpClient(NettyNioAsyncHttpClient.builder().maxConcurrency(MAX_CONCURRENCY).build())
.asyncConfiguration(
ClientAsyncConfiguration.builder()
.advancedOption(SdkAdvancedAsyncClientOption.FUTURE_COMPLETION_EXECUTOR,
executor)
.build())
.overrideConfiguration(this::overrideConfig).region(localDbConfig.getRegion());
if (endpointOverride != null) {
logger.debug("DynamoDB endpoint has been overridden to {}", endpointOverride);
lowlevelClientBuilder.endpointOverride(endpointOverride);
}
DynamoDbAsyncClient lowlevelClient = lowlevelClientBuilder.build();
client = DynamoDbEnhancedAsyncClient.builder().dynamoDbClient(lowlevelClient).build();
this.lowLevelClient = lowlevelClient;
}
} catch (Exception e) {
logger.error("Error constructing dynamodb client", e);
return false;
@@ -264,111 +244,84 @@ public class DynamoDBPersistenceService extends AbstractBufferedPersistenceServi
return true;
}
@Override
public DynamoDBItem<?> persistenceItemFromState(String name, State state, ZonedDateTime time) {
return AbstractDynamoDBItem.fromState(name, state, time);
}
/**
* Create table (if not present) and wait for table to become active.
*
* Synchronized in order to ensure that at most single thread is creating the table at a time
*
* @param mapper
* @param dtoClass
* @return whether table creation succeeded.
*/
private synchronized boolean createTable(DynamoDBMapper mapper, Class<?> dtoClass) {
if (db == null) {
return false;
private CompletableFuture<Boolean> resolveTableSchema() {
DynamoDBTableNameResolver localTableNameResolver = tableNameResolver;
DynamoDbAsyncClient localLowLevelClient = lowLevelClient;
if (localTableNameResolver == null || localLowLevelClient == null) {
throw new IllegalStateException("tableNameResolver or localLowLevelClient not available");
}
String tableName;
try {
ProvisionedThroughput provisionedThroughput = new ProvisionedThroughput(dbConfig.getReadCapacityUnits(),
dbConfig.getWriteCapacityUnits());
CreateTableRequest request = mapper.generateCreateTableRequest(dtoClass);
request.setProvisionedThroughput(provisionedThroughput);
if (request.getGlobalSecondaryIndexes() != null) {
for (GlobalSecondaryIndex index : request.getGlobalSecondaryIndexes()) {
index.setProvisionedThroughput(provisionedThroughput);
if (localTableNameResolver.isFullyResolved()) {
return CompletableFuture.completedFuture(true);
} else {
synchronized (localTableNameResolver) {
if (localTableNameResolver.isFullyResolved()) {
return CompletableFuture.completedFuture(true);
}
return localTableNameResolver.resolveSchema(localLowLevelClient,
b -> b.overrideConfiguration(this::overrideConfig), executor).thenApplyAsync(resolved -> {
if (resolved && localTableNameResolver.getTableSchema() == ExpectedTableSchema.LEGACY) {
logger.warn(
"Using legacy table format. It is recommended to migrate to the new table format: specify the 'table' parameter and unset the old 'tablePrefix' parameter.");
}
return resolved;
}, executor);
}
tableName = request.getTableName();
try {
db.getDynamoClient().describeTable(tableName);
} catch (ResourceNotFoundException e) {
// No table present, continue with creation
db.getDynamoClient().createTable(request);
} catch (AmazonClientException e) {
logger.error("Table creation failed due to error in describeTable operation", e);
return false;
}
// table found or just created, wait
return waitForTableToBecomeActive(tableName);
} catch (AmazonClientException e) {
logger.error("Exception when creating table", e);
return false;
}
}
private boolean waitForTableToBecomeActive(String tableName) {
try {
logger.debug("Checking if table '{}' is created...", tableName);
final TableDescription tableDescription;
try {
tableDescription = db.getDynamoDB().getTable(tableName).waitForActive();
} catch (IllegalArgumentException e) {
logger.warn("Table '{}' is being deleted: {} {}", tableName, e.getClass().getSimpleName(),
e.getMessage());
return false;
} catch (ResourceNotFoundException e) {
logger.warn("Table '{}' was deleted unexpectedly: {} {}", tableName, e.getClass().getSimpleName(),
e.getMessage());
return false;
}
boolean success = TableStatus.ACTIVE.equals(TableStatus.fromValue(tableDescription.getTableStatus()));
if (success) {
logger.debug("Creation of table '{}' successful, table status is now {}", tableName,
tableDescription.getTableStatus());
} else {
logger.warn("Creation of table '{}' unsuccessful, table status is now {}", tableName,
tableDescription.getTableStatus());
}
return success;
} catch (AmazonClientException e) {
logger.error("Exception when checking table status (describe): {}", e.getMessage());
return false;
} catch (InterruptedException e) {
logger.error("Interrupted while trying to check table status: {}", e.getMessage());
return false;
private <T extends DynamoDBItem<?>> DynamoDbAsyncTable<T> getTable(Class<T> dtoClass) {
DynamoDbEnhancedAsyncClient localClient = client;
DynamoDBTableNameResolver localTableNameResolver = tableNameResolver;
if (!ensureClient() || localClient == null || localTableNameResolver == null) {
throw new IllegalStateException("Client not ready");
}
ExpectedTableSchema expectedTableSchemaRevision = localTableNameResolver.getTableSchema();
String tableName = localTableNameResolver.fromClass(dtoClass);
final TableSchema<T> schema = getDynamoDBTableSchema(dtoClass, expectedTableSchemaRevision);
@SuppressWarnings("unchecked") // OK since this is the only place tableCache is populated
DynamoDbAsyncTable<T> table = (DynamoDbAsyncTable<T>) tableCache.computeIfAbsent(dtoClass, clz -> {
return localClient.table(tableName, schema);
});
if (table == null) {
// Invariant. To make null checker happy
throw new IllegalStateException();
}
return table;
}
private static <T extends DynamoDBItem<?>> TableSchema<T> getDynamoDBTableSchema(Class<T> dtoClass,
ExpectedTableSchema expectedTableSchemaRevision) {
if (dtoClass.equals(DynamoDBBigDecimalItem.class)) {
@SuppressWarnings("unchecked") // OK thanks to above conditional
TableSchema<T> schema = (TableSchema<T>) (expectedTableSchemaRevision == ExpectedTableSchema.NEW
? DynamoDBBigDecimalItem.TABLE_SCHEMA_NEW
: DynamoDBBigDecimalItem.TABLE_SCHEMA_LEGACY);
return schema;
} else if (dtoClass.equals(DynamoDBStringItem.class)) {
@SuppressWarnings("unchecked") // OK thanks to above conditional
TableSchema<T> schema = (TableSchema<T>) (expectedTableSchemaRevision == ExpectedTableSchema.NEW
? DynamoDBStringItem.TABLE_SCHEMA_NEW
: DynamoDBStringItem.TABLE_SCHEMA_LEGACY);
return schema;
} else {
throw new IllegalStateException("Unknown DTO class. Bug");
}
}
private void resetClient() {
if (db == null) {
private void disconnect() {
DynamoDbAsyncClient localLowLevelClient = lowLevelClient;
if (client == null || localLowLevelClient == null) {
return;
}
db.shutdown();
db = null;
localLowLevelClient.close();
lowLevelClient = null;
client = null;
dbConfig = null;
tableNameResolver = null;
isProperlyConfigured = false;
tableCache.clear();
}
private DynamoDBMapper getDBMapper(String tableName) {
try {
DynamoDBMapperConfig mapperConfig = new DynamoDBMapperConfig.Builder()
.withTableNameOverride(new DynamoDBMapperConfig.TableNameOverride(tableName))
.withPaginationLoadingStrategy(PaginationLoadingStrategy.LAZY_LOADING).build();
return new DynamoDBMapper(db.getDynamoClient(), mapperConfig);
} catch (AmazonClientException e) {
logger.error("Error getting db mapper: {}", e.getMessage());
throw e;
}
}
@Override
protected boolean isReadyToStore() {
return isProperlyConfigured && ensureClient();
}
@@ -388,160 +341,123 @@ public class DynamoDBPersistenceService extends AbstractBufferedPersistenceServi
return Collections.emptySet();
}
@Override
protected void flushBufferedData() {
if (buffer != null && buffer.isEmpty()) {
return;
}
logger.debug("Writing buffered data. Buffer size: {}", buffer.size());
for (;;) {
Map<String, Deque<DynamoDBItem<?>>> itemsByTable = readBuffer();
// Write batch of data, one table at a time
for (Entry<String, Deque<DynamoDBItem<?>>> entry : itemsByTable.entrySet()) {
String tableName = entry.getKey();
Deque<DynamoDBItem<?>> batch = entry.getValue();
if (!batch.isEmpty()) {
flushBatch(getDBMapper(tableName), batch);
}
}
if (buffer != null && buffer.isEmpty()) {
break;
}
}
}
private Map<String, Deque<DynamoDBItem<?>>> readBuffer() {
Map<String, Deque<DynamoDBItem<?>>> batchesByTable = new HashMap<>(2);
// Get batch of data
while (!buffer.isEmpty()) {
DynamoDBItem<?> dynamoItem = buffer.poll();
if (dynamoItem == null) {
break;
}
String tableName = tableNameResolver.fromItem(dynamoItem);
Deque<DynamoDBItem<?>> batch = batchesByTable.computeIfAbsent(tableName, new Function<>() {
@Override
public @Nullable Deque<DynamoDBItem<?>> apply(@Nullable String t) {
return new ArrayDeque<>();
}
});
batch.add(dynamoItem);
}
return batchesByTable;
}
/**
* Flush batch of data to DynamoDB
*
* @param mapper mapper associated with the batch
* @param batch batch of data to write to DynamoDB
*/
private void flushBatch(DynamoDBMapper mapper, Deque<DynamoDBItem<?>> batch) {
long currentTimeMillis = System.currentTimeMillis();
List<FailedBatch> failed = mapper.batchSave(batch);
for (FailedBatch failedBatch : failed) {
if (failedBatch.getException() instanceof ResourceNotFoundException) {
// Table did not exist. Try again after creating table
retryFlushAfterCreatingTable(mapper, batch, failedBatch);
} else {
logger.debug("Batch failed with {}. Retrying next with exponential back-off",
failedBatch.getException().getMessage());
new ExponentialBackoffRetry(failedBatch.getUnprocessedItems()).run();
}
}
if (failed.isEmpty()) {
logger.debug("flushBatch ended with {} items in {} ms: {}", batch.size(),
System.currentTimeMillis() - currentTimeMillis, batch);
} else {
logger.warn(
"flushBatch ended with {} items in {} ms: {}. There were some failed batches that were retried -- check logs for ERRORs to see if writes were successful",
batch.size(), System.currentTimeMillis() - currentTimeMillis, batch);
}
}
/**
* Retry flushing data after creating table associated with mapper
*
* @param mapper mapper associated with the batch
* @param batch original batch of data. Used for logging and to determine table name
* @param failedBatch failed batch that should be retried
*/
private void retryFlushAfterCreatingTable(DynamoDBMapper mapper, Deque<DynamoDBItem<?>> batch,
FailedBatch failedBatch) {
logger.debug("Table was not found. Trying to create table and try saving again");
if (createTable(mapper, batch.peek().getClass())) {
logger.debug("Table creation successful, trying to save again");
if (!failedBatch.getUnprocessedItems().isEmpty()) {
ExponentialBackoffRetry retry = new ExponentialBackoffRetry(failedBatch.getUnprocessedItems());
retry.run();
if (retry.getUnprocessedItems().isEmpty()) {
logger.debug("Successfully saved items after table creation");
}
}
} else {
logger.warn("Table creation failed. Not storing some parts of batch: {}. Unprocessed items: {}", batch,
failedBatch.getUnprocessedItems());
}
}
@Override
public Iterable<HistoricItem> query(FilterCriteria filter) {
logger.debug("got a query");
logIfManyQueuedTasks();
Instant start = Instant.now();
String filterDescription = filterToString(filter);
logger.trace("Got a query with filter {}", filterDescription);
DynamoDbEnhancedAsyncClient localClient = client;
DynamoDBTableNameResolver localTableNameResolver = tableNameResolver;
if (!isProperlyConfigured) {
logger.debug("Configuration for dynamodb not yet loaded or broken. Not storing item.");
logger.debug("Configuration for dynamodb not yet loaded or broken. Returning empty query results.");
return Collections.emptyList();
}
if (!ensureClient()) {
logger.warn("DynamoDB not connected. Not storing item.");
if (!ensureClient() || localClient == null || localTableNameResolver == null) {
logger.warn("DynamoDB not connected. Returning empty query results.");
return Collections.emptyList();
}
String itemName = filter.getItemName();
Item item = getItemFromRegistry(itemName);
if (item == null) {
logger.warn("Could not get item {} from registry!", itemName);
return Collections.emptyList();
}
Class<DynamoDBItem<?>> dtoClass = AbstractDynamoDBItem.getDynamoItemClass(item.getClass());
String tableName = tableNameResolver.fromClass(dtoClass);
DynamoDBMapper mapper = getDBMapper(tableName);
logger.debug("item {} (class {}) will be tried to query using dto class {} from table {}", itemName,
item.getClass(), dtoClass, tableName);
List<HistoricItem> historicItems = new ArrayList<>();
DynamoDBQueryExpression<DynamoDBItem<?>> queryExpression = DynamoDBQueryUtils.createQueryExpression(dtoClass,
filter);
@SuppressWarnings("rawtypes")
final PaginatedQueryList<? extends DynamoDBItem> paginatedList;
//
// Resolve unclear table schema if needed
//
try {
paginatedList = mapper.query(dtoClass, queryExpression);
} catch (AmazonServiceException e) {
logger.error(
"DynamoDB query raised unexpected exception: {}. Returning empty collection. "
+ "Status code 400 (resource not found) might occur if table was just created.",
e.getMessage());
Boolean resolved = resolveTableSchema().get();
if (!resolved) {
logger.warn("Table schema not resolved, cannot query data.");
return Collections.emptyList();
}
} catch (InterruptedException e) {
logger.warn("Table schema resolution interrupted, cannot query data");
return Collections.emptyList();
} catch (ExecutionException e) {
Throwable cause = e.getCause();
logger.warn("Table schema resolution errored, cannot query data: {} {}",
cause == null ? e.getClass().getSimpleName() : cause.getClass().getSimpleName(),
cause == null ? e.getMessage() : cause.getMessage());
return Collections.emptyList();
}
for (int itemIndexOnPage = 0; itemIndexOnPage < filter.getPageSize(); itemIndexOnPage++) {
int itemIndex = filter.getPageNumber() * filter.getPageSize() + itemIndexOnPage;
DynamoDBItem<?> dynamoItem;
try {
dynamoItem = paginatedList.get(itemIndex);
} catch (IndexOutOfBoundsException e) {
logger.debug("Index {} is out-of-bounds", itemIndex);
break;
try {
//
// Proceed with query
//
String itemName = filter.getItemName();
Item item = getItemFromRegistry(itemName);
if (item == null) {
logger.warn("Could not get item {} from registry! Returning empty query results.", itemName);
return Collections.emptyList();
}
if (dynamoItem != null) {
HistoricItem historicItem = dynamoItem.asHistoricItem(item);
logger.trace("Dynamo item {} converted to historic item: {}", item, historicItem);
historicItems.add(historicItem);
if (item instanceof GroupItem) {
item = ((GroupItem) item).getBaseItem();
logger.debug("Item is instanceof GroupItem '{}'", itemName);
if (item == null) {
logger.debug("BaseItem of GroupItem is null. Ignore and give up!");
return Collections.emptyList();
}
if (item instanceof GroupItem) {
logger.debug("BaseItem of GroupItem is a GroupItem too. Ignore and give up!");
return Collections.emptyList();
}
}
boolean legacy = localTableNameResolver.getTableSchema() == ExpectedTableSchema.LEGACY;
Class<? extends DynamoDBItem<?>> dtoClass = AbstractDynamoDBItem.getDynamoItemClass(item.getClass(),
legacy);
String tableName = localTableNameResolver.fromClass(dtoClass);
DynamoDbAsyncTable<? extends DynamoDBItem<?>> table = getTable(dtoClass);
logger.debug("Item {} (of type {}) will be queried using DTO class {} from table {}", itemName,
item.getClass().getSimpleName(), dtoClass.getSimpleName(), tableName);
QueryEnhancedRequest queryExpression = DynamoDBQueryUtils.createQueryExpression(dtoClass,
localTableNameResolver.getTableSchema(), item, filter);
CompletableFuture<List<DynamoDBItem<?>>> itemsFuture = new CompletableFuture<>();
final SdkPublisher<? extends DynamoDBItem<?>> itemPublisher = table.query(queryExpression).items();
Subscriber<DynamoDBItem<?>> pageSubscriber = new PageOfInterestSubscriber<DynamoDBItem<?>>(itemsFuture,
filter.getPageNumber(), filter.getPageSize());
itemPublisher.subscribe(pageSubscriber);
// NumberItem.getUnit() is expensive, we avoid calling it in the loop
// by fetching the unit here.
final Item localItem = item;
final Unit<?> itemUnit = localItem instanceof NumberItem ? ((NumberItem) localItem).getUnit() : null;
try {
@SuppressWarnings("null")
List<HistoricItem> results = itemsFuture.get().stream().map(dynamoItem -> {
HistoricItem historicItem = dynamoItem.asHistoricItem(localItem, itemUnit);
if (historicItem == null) {
logger.warn(
"Dynamo item {} serialized state '{}' cannot be converted to item {} {}. Item type changed since persistence. Ignoring",
dynamoItem.getClass().getSimpleName(), dynamoItem.getState(),
localItem.getClass().getSimpleName(), localItem.getName());
return null;
}
logger.trace("Dynamo item {} converted to historic item: {}", localItem, historicItem);
return historicItem;
}).filter(value -> value != null).collect(Collectors.toList());
logger.debug("Query completed in {} ms. Filter was {}",
Duration.between(start, Instant.now()).toMillis(), filterDescription);
return results;
} catch (InterruptedException e) {
logger.warn("Query interrupted. Filter was {}", filterDescription);
return Collections.emptyList();
} catch (ExecutionException e) {
Throwable cause = e.getCause();
if (cause instanceof ResourceNotFoundException) {
logger.trace("Query failed since the DynamoDB table '{}' does not exist. Filter was {}", tableName,
filterDescription);
} else if (logger.isTraceEnabled()) {
logger.trace("Query failed. Filter was {}", filterDescription, e);
} else {
logger.warn("Query failed {} {}. Filter was {}",
cause == null ? e.getClass().getSimpleName() : cause.getClass().getSimpleName(),
cause == null ? e.getMessage() : cause.getMessage(), filterDescription);
}
return Collections.emptyList();
}
} catch (Exception e) {
logger.error("Unexpected error with query having filter {}: {} {}. Returning empty query results.",
filterDescription, e.getClass().getSimpleName(), e.getMessage());
return Collections.emptyList();
}
return historicItems;
}
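The query path above subscribes a PageOfInterestSubscriber to the SDK item publisher so that only the requested page is collected. Its windowing logic amounts to skipping the first pageNumber × pageSize items and taking the next pageSize; a minimal sketch (the helper name below is illustrative, and the real subscriber works on a reactive stream rather than a java.util.stream):

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

// Minimal sketch of the page-of-interest windowing applied to the item stream.
public class PageWindowSketch {
    static <T> List<T> pageOfInterest(Stream<T> items, int pageNumber, int pageSize) {
        // Skip all items belonging to earlier pages, then take exactly one page.
        return items.skip((long) pageNumber * pageSize).limit(pageSize).collect(Collectors.toList());
    }
}
```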
/**
@@ -551,19 +467,201 @@ public class DynamoDBPersistenceService extends AbstractBufferedPersistenceServi
* @return item with the given name, or null if no such item exists in item registry.
*/
private @Nullable Item getItemFromRegistry(String itemName) {
Item item = null;
try {
if (itemRegistry != null) {
item = itemRegistry.getItem(itemName);
}
return itemRegistry.getItem(itemName);
} catch (ItemNotFoundException e1) {
logger.error("Unable to get item {} from registry", itemName);
return null;
}
return item;
}
@Override
public List<PersistenceStrategy> getDefaultStrategies() {
return List.of(PersistenceStrategy.Globals.RESTORE, PersistenceStrategy.Globals.CHANGE);
}
@Override
public void store(Item item) {
store(item, null);
}
@Override
public void store(Item item, @Nullable String alias) {
// Timestamp and capture the state immediately, as the rest of the store operation is asynchronous (state might change in between)
ZonedDateTime time = ZonedDateTime.now();
logIfManyQueuedTasks();
if (!(item instanceof GenericItem)) {
return;
}
if (item.getState() instanceof UnDefType) {
logger.debug("Undefined item state received. Not storing item {}.", item.getName());
return;
}
if (!isReadyToStore()) {
logger.warn("Not ready to store (config error?), not storing item {}.", item.getName());
return;
}
// Get Item describing the real type of data
// With non-group items this is same as the argument item. With Group items, this is item describing the type of
// state stored in the group.
final Item itemTemplate;
try {
itemTemplate = getEffectiveItem(item);
} catch (IllegalStateException e) {
// Exception is raised when underlying item type cannot be determined with Group item
// Logged already
return;
}
String effectiveName = (alias != null) ? alias : item.getName();
// We do not want to rely on item.getState() since the async context below can execute much later.
// We 'copy' the item for local use. copyItem also normalizes the unit with NumberItems.
final GenericItem copiedItem = copyItem(itemTemplate, item, effectiveName, null);
resolveTableSchema().thenAcceptAsync(resolved -> {
if (!resolved) {
logger.warn("Table schema not resolved, not storing item {}.", copiedItem.getName());
return;
}
DynamoDbEnhancedAsyncClient localClient = client;
DynamoDbAsyncClient localLowlevelClient = lowLevelClient;
DynamoDBConfig localConfig = dbConfig;
DynamoDBTableNameResolver localTableNameResolver = tableNameResolver;
if (!isProperlyConfigured || localClient == null || localLowlevelClient == null || localConfig == null
|| localTableNameResolver == null) {
logger.warn("Not ready to store (config error?), not storing item {}.", item.getName());
return;
}
Integer expireDays = localConfig.getExpireDays();
final DynamoDBItem<?> dto;
switch (localTableNameResolver.getTableSchema()) {
case NEW:
dto = AbstractDynamoDBItem.fromStateNew(copiedItem, time, expireDays);
break;
case LEGACY:
dto = AbstractDynamoDBItem.fromStateLegacy(copiedItem, time);
break;
default:
throw new IllegalStateException("Unexpected. Bug");
}
logger.trace("store() called with item {} {} '{}', which was converted to DTO {}",
copiedItem.getClass().getSimpleName(), effectiveName, copiedItem.getState(), dto);
dto.accept(new DynamoDBItemVisitor<TableCreatingPutItem<? extends DynamoDBItem<?>>>() {
@Override
public TableCreatingPutItem<? extends DynamoDBItem<?>> visit(
DynamoDBBigDecimalItem dynamoBigDecimalItem) {
return new TableCreatingPutItem<DynamoDBBigDecimalItem>(DynamoDBPersistenceService.this,
dynamoBigDecimalItem, getTable(DynamoDBBigDecimalItem.class));
}
@Override
public TableCreatingPutItem<? extends DynamoDBItem<?>> visit(DynamoDBStringItem dynamoStringItem) {
return new TableCreatingPutItem<DynamoDBStringItem>(DynamoDBPersistenceService.this,
dynamoStringItem, getTable(DynamoDBStringItem.class));
}
}).putItemAsync();
}, executor).exceptionally(e -> {
logger.error("Unexpected error", e);
return null;
});
}
private Item getEffectiveItem(Item item) {
final Item effectiveItem;
if (item instanceof GroupItem) {
Item baseItem = ((GroupItem) item).getBaseItem();
if (baseItem == null) {
// GroupItem:<ItemType> base type is not defined in the *.items file;
// fall back to the first member item to detect the underlying type
logger.debug(
"Cannot detect ItemType for {} because the GroupItem's base type isn't set in the *.items file.",
item.getName());
Iterator<Item> firstGroupMemberItem = ((GroupItem) item).getMembers().iterator();
if (firstGroupMemberItem.hasNext()) {
effectiveItem = firstGroupMemberItem.next();
} else {
throw new IllegalStateException("GroupItem " + item.getName()
+ " does not have children nor base item set, cannot determine underlying item type. Aborting!");
}
} else {
effectiveItem = baseItem;
}
} else {
effectiveItem = item;
}
return effectiveItem;
}
/**
* Copy item and optionally override name and state
*
* State is normalized to the template item's unit when the state is a QuantityType and the template is a NumberItem
*
* @param itemTemplate 'template item' to be used to construct the new copy. It is also used to determine UoM unit
* and get GenericItem.type
* @param item item that is used to acquire name and state
* @param nameOverride name override for the resulting copy
* @param stateOverride state override for the resulting copy
* @throws IllegalArgumentException when state is QuantityType and not compatible with item
*/
static GenericItem copyItem(Item itemTemplate, Item item, @Nullable String nameOverride,
@Nullable State stateOverride) {
final GenericItem copiedItem;
try {
if (itemTemplate instanceof NumberItem) {
copiedItem = (GenericItem) itemTemplate.getClass().getDeclaredConstructor(String.class, String.class)
.newInstance(itemTemplate.getType(), nameOverride == null ? item.getName() : nameOverride);
} else {
copiedItem = (GenericItem) itemTemplate.getClass().getDeclaredConstructor(String.class)
.newInstance(nameOverride == null ? item.getName() : nameOverride);
}
} catch (InstantiationException | IllegalAccessException | IllegalArgumentException | InvocationTargetException
| NoSuchMethodException | SecurityException e) {
throw new IllegalArgumentException(item.toString(), e);
}
State state = stateOverride == null ? item.getState() : stateOverride;
if (state instanceof QuantityType<?> && itemTemplate instanceof NumberItem) {
Unit<?> itemUnit = ((NumberItem) itemTemplate).getUnit();
if (itemUnit != null) {
State convertedState = ((QuantityType<?>) state).toUnit(itemUnit);
if (convertedState == null) {
logger.error("Unexpected unit conversion failure: {} to item unit {}", state, itemUnit);
throw new IllegalArgumentException(
String.format("Unexpected unit conversion failure: %s to item unit %s", state, itemUnit));
}
state = convertedState;
}
}
copiedItem.setState(state);
return copiedItem;
}
private void logIfManyQueuedTasks() {
if (executor instanceof ThreadPoolExecutor) {
ThreadPoolExecutor localExecutor = (ThreadPoolExecutor) executor;
if (localExecutor.getQueue().size() >= 50) {
    logger.warn(
            "Many ({}) tasks queued in executor! This might be a sign of bad design or a bug in the addon code.",
            localExecutor.getQueue().size());
} else if (localExecutor.getQueue().size() >= 5) {
    logger.trace("executor queue size: {}, remaining space {}. Active threads {}",
            localExecutor.getQueue().size(), localExecutor.getQueue().remainingCapacity(),
            localExecutor.getActiveCount());
}
}
}
private String filterToString(FilterCriteria filter) {
return String.format(
        "FilterCriteria@%s(item=%s, pageNumber=%d, pageSize=%d, time=[%s, %s], ordering=%s, state=[%s %s of %s])",
        System.identityHashCode(filter), filter.getItemName(), filter.getPageNumber(), filter.getPageSize(),
        filter.getBeginDate(), filter.getEndDate(), filter.getOrdering(), filter.getOperator(),
        filter.getState(), filter.getState() == null ? "null" : filter.getState().getClass().getSimpleName());
}
}
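The unit normalization that `copyItem` performs on QuantityType states can be sketched in isolation. The helper below is hypothetical (plain Java, no openHAB or javax.measure types, and not part of the addon); it only illustrates the idea of converting an incoming quantity to the item's configured unit before persisting, so that all stored numbers are comparable:

```java
// Hypothetical sketch of unit normalization before persisting.
// A real implementation would delegate to javax.measure via QuantityType.toUnit().
final class UnitNormalizer {
    /** Convert a temperature value given in 'unit' to the item's configured unit (here: Celsius). */
    static double toCelsius(double value, String unit) {
        switch (unit) {
            case "°C":
                return value;
            case "°F":
                return (value - 32.0) * 5.0 / 9.0;
            case "K":
                return value - 273.15;
            default:
                // Mirrors copyItem throwing IllegalArgumentException on incompatible units
                throw new IllegalArgumentException("Incompatible unit: " + unit);
        }
    }
}
```

As in `copyItem`, an incompatible unit is rejected with `IllegalArgumentException` rather than stored unconverted.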


@@ -12,19 +12,23 @@
*/
package org.openhab.persistence.dynamodb.internal;
import java.lang.reflect.InvocationTargetException;
import java.time.ZonedDateTime;
import java.util.Collections;
import org.eclipse.jdt.annotation.NonNullByDefault;
import org.eclipse.jdt.annotation.Nullable;
import org.openhab.core.items.GenericItem;
import org.openhab.core.items.Item;
import org.openhab.core.persistence.FilterCriteria;
import org.openhab.core.persistence.FilterCriteria.Operator;
import org.openhab.core.persistence.FilterCriteria.Ordering;
import software.amazon.awssdk.enhanced.dynamodb.AttributeConverter;
import software.amazon.awssdk.enhanced.dynamodb.Expression;
import software.amazon.awssdk.enhanced.dynamodb.Expression.Builder;
import software.amazon.awssdk.enhanced.dynamodb.model.QueryConditional;
import software.amazon.awssdk.enhanced.dynamodb.model.QueryEnhancedRequest;
import software.amazon.awssdk.services.dynamodb.model.AttributeValue;
/**
* Utility class
@@ -36,88 +40,141 @@ public class DynamoDBQueryUtils {
    /**
     * Construct a DynamoDB query from the given filter
     *
     * @param dtoClass dto class
     * @param expectedTableSchema table schema to query against
     * @param item item corresponding to filter
     * @param filter filter for the query
     * @return QueryEnhancedRequest corresponding to the given FilterCriteria
     * @throws IllegalArgumentException when the schema is not fully resolved
     */
    public static QueryEnhancedRequest createQueryExpression(Class<? extends DynamoDBItem<?>> dtoClass,
            ExpectedTableSchema expectedTableSchema, Item item, FilterCriteria filter) {
        if (!expectedTableSchema.isFullyResolved()) {
            throw new IllegalArgumentException("Schema not resolved");
        }
        QueryEnhancedRequest.Builder queryBuilder = QueryEnhancedRequest.builder()
                .scanIndexForward(filter.getOrdering() == Ordering.ASCENDING);
        addFilterbyItemAndTimeFilter(queryBuilder, expectedTableSchema, filter.getItemName(), filter);
        addStateFilter(queryBuilder, expectedTableSchema, item, dtoClass, filter);
        addProjection(dtoClass, expectedTableSchema, queryBuilder);
        return queryBuilder.build();
    }
    /**
     * Add projection for the key attributes only, not the expiry date
     */
    private static void addProjection(Class<? extends DynamoDBItem<?>> dtoClass,
            ExpectedTableSchema expectedTableSchema, QueryEnhancedRequest.Builder queryBuilder) {
        boolean legacy = expectedTableSchema == ExpectedTableSchema.LEGACY;
        if (legacy) {
            queryBuilder.attributesToProject(DynamoDBItem.ATTRIBUTE_NAME_ITEMNAME_LEGACY,
                    DynamoDBItem.ATTRIBUTE_NAME_TIMEUTC_LEGACY, DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE_LEGACY);
        } else {
            acceptAsEmptyDTO(dtoClass, new DynamoDBItemVisitor<@Nullable Void>() {
                @Override
                public @Nullable Void visit(DynamoDBStringItem dynamoStringItem) {
                    queryBuilder.attributesToProject(DynamoDBItem.ATTRIBUTE_NAME_ITEMNAME,
                            DynamoDBItem.ATTRIBUTE_NAME_TIMEUTC, DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE_STRING);
                    return null;
                }

                @Override
                public @Nullable Void visit(DynamoDBBigDecimalItem dynamoBigDecimalItem) {
                    queryBuilder.attributesToProject(DynamoDBItem.ATTRIBUTE_NAME_ITEMNAME,
                            DynamoDBItem.ATTRIBUTE_NAME_TIMEUTC, DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE_NUMBER);
                    return null;
                }
            });
        }
    }
    private static void addStateFilter(QueryEnhancedRequest.Builder queryBuilder,
            ExpectedTableSchema expectedTableSchema, Item item, Class<? extends DynamoDBItem<?>> dtoClass,
            FilterCriteria filter) {
        final Expression expression;
        Builder itemStateTypeExpressionBuilder = Expression.builder().expression("attribute_exists(#attr)");
        boolean legacy = expectedTableSchema == ExpectedTableSchema.LEGACY;
        acceptAsEmptyDTO(dtoClass, new DynamoDBItemVisitor<@Nullable Void>() {
            @Override
            public @Nullable Void visit(DynamoDBStringItem dynamoStringItem) {
                itemStateTypeExpressionBuilder.putExpressionName("#attr",
                        legacy ? DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE_LEGACY
                                : DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE_STRING);
                return null;
            }

            @Override
            public @Nullable Void visit(DynamoDBBigDecimalItem dynamoBigDecimalItem) {
                itemStateTypeExpressionBuilder.putExpressionName("#attr",
                        legacy ? DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE_LEGACY
                                : DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE_NUMBER);
                return null;
            }
        });
        if (filter.getOperator() != null && filter.getState() != null) {
            // Convert the filter's state to a DynamoDBItem in order to get a suitable string representation
            Expression.Builder stateFilterExpressionBuilder = Expression.builder()
                    .expression(String.format("#attr %s :value", operatorAsString(filter.getOperator())));
            // The following throws IllegalArgumentException when the filter state is not compatible with
            // the item. This is acceptable.
            GenericItem stateToFind = DynamoDBPersistenceService.copyItem(item, item, filter.getItemName(),
                    filter.getState());
            acceptAsDTO(stateToFind, legacy, new DynamoDBItemVisitor<@Nullable Void>() {
                @Override
                public @Nullable Void visit(DynamoDBStringItem serialized) {
                    stateFilterExpressionBuilder.putExpressionName("#attr",
                            legacy ? DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE_LEGACY
                                    : DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE_STRING);
                    stateFilterExpressionBuilder.putExpressionValue(":value",
                            AttributeValue.builder().s(serialized.getState()).build());
                    return null;
                }

                @SuppressWarnings("null")
                @Override
                public @Nullable Void visit(DynamoDBBigDecimalItem serialized) {
                    stateFilterExpressionBuilder.putExpressionName("#attr",
                            legacy ? DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE_LEGACY
                                    : DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE_NUMBER);
                    stateFilterExpressionBuilder.putExpressionValue(":value",
                            AttributeValue.builder().n(serialized.getState().toPlainString()).build());
                    return null;
                }
            });
            expression = Expression.join(stateFilterExpressionBuilder.build(), itemStateTypeExpressionBuilder.build(),
                    "AND");
        } else {
            expression = itemStateTypeExpressionBuilder.build();
        }
        queryBuilder.filterExpression(expression);
    }
    private static void addFilterbyItemAndTimeFilter(QueryEnhancedRequest.Builder queryBuilder,
            ExpectedTableSchema expectedTableSchema, String partition, final FilterCriteria filter) {
        boolean hasBegin = filter.getBeginDate() != null;
        boolean hasEnd = filter.getEndDate() != null;
        boolean legacy = expectedTableSchema == ExpectedTableSchema.LEGACY;
        AttributeConverter<ZonedDateTime> timeConverter = AbstractDynamoDBItem.getTimestampConverter(legacy);
        if (!hasBegin && !hasEnd) {
            // No time filter needed, but we still filter by partition
            queryBuilder.queryConditional(QueryConditional.keyEqualTo(k -> k.partitionValue(partition)));
        } else if (hasBegin && !hasEnd) {
            queryBuilder.queryConditional(QueryConditional.sortGreaterThan(
                    k -> k.partitionValue(partition).sortValue(timeConverter.transformFrom(filter.getBeginDate()))));
        } else if (!hasBegin && hasEnd) {
            queryBuilder.queryConditional(QueryConditional.sortLessThan(
                    k -> k.partitionValue(partition).sortValue(timeConverter.transformFrom(filter.getEndDate()))));
        } else {
            assert hasBegin && hasEnd; // invariant
            queryBuilder.queryConditional(QueryConditional.sortBetween(
                    k -> k.partitionValue(partition).sortValue(timeConverter.transformFrom(filter.getBeginDate())),
                    k -> k.partitionValue(partition).sortValue(timeConverter.transformFrom(filter.getEndDate()))));
        }
    }
/**
@@ -145,4 +202,23 @@ public class DynamoDBQueryUtils {
throw new IllegalStateException("Unknown operator " + op);
}
}
private static <T> void acceptAsDTO(Item item, boolean legacy, DynamoDBItemVisitor<T> visitor) {
ZonedDateTime dummyTimestamp = ZonedDateTime.now();
if (legacy) {
AbstractDynamoDBItem.fromStateLegacy(item, dummyTimestamp).accept(visitor);
} else {
AbstractDynamoDBItem.fromStateNew(item, dummyTimestamp, null).accept(visitor);
}
}
private static <T> void acceptAsEmptyDTO(Class<? extends DynamoDBItem<?>> dtoClass,
DynamoDBItemVisitor<T> visitor) {
try {
dtoClass.getDeclaredConstructor().newInstance().accept(visitor);
} catch (InstantiationException | IllegalAccessException | IllegalArgumentException | InvocationTargetException
| NoSuchMethodException | SecurityException e) {
throw new IllegalStateException(e);
}
}
}
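The filter expressions built above rely on DynamoDB placeholder substitution: `#attr` is bound through expression attribute names and `:value` through expression attribute values, so e.g. `#attr >= :value` is evaluated server-side as `itemState >= 5`. The helper below is a hypothetical plain-Java sketch (no AWS SDK, not part of the addon) that previews this substitution locally:

```java
import java.util.Map;

// Illustrative only: previews how DynamoDB expression placeholders expand.
// Real substitution happens server-side inside DynamoDB, not in client code.
final class ExpressionPreview {
    static String render(String expression, Map<String, String> names, Map<String, String> values) {
        String out = expression;
        // "#name" placeholders map to real attribute names
        for (Map.Entry<String, String> e : names.entrySet()) {
            out = out.replace(e.getKey(), e.getValue());
        }
        // ":value" placeholders map to literal attribute values
        for (Map.Entry<String, String> e : values.entrySet()) {
            out = out.replace(e.getKey(), e.getValue());
        }
        return out;
    }
}
```

With the names used in `addStateFilter`, `attribute_exists(#attr)` previews to `attribute_exists(itemState)` once `#attr` is bound.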


@@ -14,64 +14,59 @@ package org.openhab.persistence.dynamodb.internal;
import java.time.ZonedDateTime;
import org.eclipse.jdt.annotation.NonNullByDefault;
import org.eclipse.jdt.annotation.Nullable;
import software.amazon.awssdk.enhanced.dynamodb.mapper.StaticTableSchema;
/**
* DynamoDBItem for items that can be serialized as DynamoDB string
*
* @author Sami Salonen - Initial contribution
*/
@NonNullByDefault
public class DynamoDBStringItem extends AbstractDynamoDBItem<String> {
@SuppressWarnings("unchecked")
private static final Class<@Nullable String> NULLABLE_STRING = (Class<@Nullable String>) String.class;
public static final StaticTableSchema<DynamoDBStringItem> TABLE_SCHEMA_LEGACY = getBaseSchemaBuilder(
DynamoDBStringItem.class, true)
.newItemSupplier(
DynamoDBStringItem::new)
.addAttribute(NULLABLE_STRING, a -> a.name(DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE_LEGACY)
.getter(DynamoDBStringItem::getState).setter(DynamoDBStringItem::setState))
.build();
public static final StaticTableSchema<DynamoDBStringItem> TABLE_SCHEMA_NEW = getBaseSchemaBuilder(
DynamoDBStringItem.class, false)
.newItemSupplier(DynamoDBStringItem::new)
.addAttribute(NULLABLE_STRING,
a -> a.name(DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE_STRING)
.getter(DynamoDBStringItem::getState).setter(DynamoDBStringItem::setState))
.addAttribute(NULLABLE_LONG, a -> a.name(ATTRIBUTE_NAME_EXPIRY)
.getter(AbstractDynamoDBItem::getExpiryDate).setter(AbstractDynamoDBItem::setExpiry))
.build();
    public DynamoDBStringItem() {
        this("", null, ZonedDateTime.now(), null);
    }

    public DynamoDBStringItem(String name, @Nullable String state, ZonedDateTime time, @Nullable Integer expireDays) {
        super(name, state, time, expireDays);
    }
    @Override
    public @Nullable String getState() {
        return state;
    }
@Override
public String getName() {
return name;
}
@Override
public ZonedDateTime getTime() {
return time;
}
@Override
public void setName(String name) {
this.name = name;
}
@Override
public void setState(@Nullable String state) {
this.state = state;
}
    @Override
    public <T> T accept(DynamoDBItemVisitor<T> visitor) {
        return visitor.visit(this);
    }
}


@@ -12,37 +12,111 @@
*/
package org.openhab.persistence.dynamodb.internal;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.function.Consumer;
import org.eclipse.jdt.annotation.NonNullByDefault;
import org.eclipse.jdt.annotation.Nullable;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import software.amazon.awssdk.services.dynamodb.DynamoDbAsyncClient;
import software.amazon.awssdk.services.dynamodb.model.DescribeTableRequest;
import software.amazon.awssdk.services.dynamodb.model.ResourceNotFoundException;
import software.amazon.awssdk.services.dynamodb.model.TableStatus;
/**
* The DynamoDBTableNameResolver resolves DynamoDB table name for a given item.
*
* @author Sami Salonen - Initial contribution
*
*/
@NonNullByDefault
public class DynamoDBTableNameResolver {
private final Logger logger = LoggerFactory.getLogger(DynamoDBTableNameResolver.class);
private final String tablePrefix;
private ExpectedTableSchema tableRevision;
private String table;
public DynamoDBTableNameResolver(ExpectedTableSchema tableRevision, String table, String tablePrefix) {
this.tableRevision = tableRevision;
this.table = table;
this.tablePrefix = tablePrefix;
switch (tableRevision) {
case NEW:
if (table.isBlank()) {
throw new IllegalArgumentException("table should be specified with NEW schema");
}
break;
case MAYBE_LEGACY:
if (table.isBlank()) {
throw new IllegalArgumentException("table should be specified with MAYBE_LEGACY schema");
}
// fall-through
case LEGACY:
if (tablePrefix.isBlank()) {
throw new IllegalArgumentException("tablePrefix should be specified with LEGACY schema");
}
break;
default:
throw new IllegalArgumentException("Bug");
}
}
    /**
     * Resolve the table name for the given DynamoDBItem. The item's class is used to determine the
     * table name.
     *
     * @param item dto to use to determine table name
     * @return table name
     * @throws IllegalStateException when table schema is not determined
     */
public String fromItem(DynamoDBItem<?> item) {
if (!isFullyResolved()) {
throw new IllegalStateException();
}
switch (tableRevision) {
case NEW:
return getTableNameAccordingToNewSchema();
case LEGACY:
return getTableNameAccordingToLegacySchema(item);
default:
throw new IllegalArgumentException("Bug");
}
}
/**
* Get table name according to new schema. This instance does not have to have fully determined schema
*
* @return table name
*/
private String getTableNameAccordingToNewSchema() {
return table;
}
/**
* Get table name according to legacy schema. This instance does not have to have fully determined schema
*
* @param item dto to use to determine table name
* @return table name
*/
private String getTableNameAccordingToLegacySchema(DynamoDBItem<?> item) {
// Use the visitor pattern to deduce the table name
        return item.accept(new DynamoDBItemVisitor<String>() {
            @Override
            public String visit(DynamoDBBigDecimalItem dynamoBigDecimalItem) {
                return tablePrefix + "bigdecimal";
            }

            @Override
            public String visit(DynamoDBStringItem dynamoStringItem) {
                return tablePrefix + "string";
            }
        });
}
/**
@@ -62,4 +136,84 @@ public class DynamoDBTableNameResolver {
}
return this.fromItem(dummy);
}
/**
* Whether we have determined the schema and table names to use
*
* @return true when schema revision is clearly specified
*/
public boolean isFullyResolved() {
return tableRevision.isFullyResolved();
}
public CompletableFuture<Boolean> resolveSchema(DynamoDbAsyncClient lowLevelClient,
Consumer<DescribeTableRequest.Builder> describeTableRequestMutator, ExecutorService executor) {
        CompletableFuture<Boolean> resolved = new CompletableFuture<>();
        if (isFullyResolved()) {
            resolved.complete(true);
            return resolved;
        }
String numberTableLegacy = getTableNameAccordingToLegacySchema(new DynamoDBBigDecimalItem());
String stringTableLegacy = getTableNameAccordingToLegacySchema(new DynamoDBStringItem());
CompletableFuture<@Nullable Boolean> tableSchemaNumbers = tableIsPresent(lowLevelClient,
describeTableRequestMutator, executor, numberTableLegacy);
CompletableFuture<@Nullable Boolean> tableSchemaStrings = tableIsPresent(lowLevelClient,
describeTableRequestMutator, executor, stringTableLegacy);
tableSchemaNumbers.thenAcceptBothAsync(tableSchemaStrings, (table1Present, table2Present) -> {
if (table1Present != null && table2Present != null) {
// Since the Booleans are not null, we know for sure whether table is present or not
// If old tables do not exist, we default to new table layout/schema
tableRevision = (!table1Present && !table2Present) ? ExpectedTableSchema.NEW
: ExpectedTableSchema.LEGACY;
}
resolved.complete(table1Present != null && table2Present != null);
}, executor).exceptionally(e -> {
// should not happen as individual futures have exceptions handled
logger.error("Unexpected error. BUG", e);
resolved.complete(false);
return null;
});
return resolved;
}
/**
*
* @return whether table exists, or null when state is unknown
*/
private CompletableFuture<@Nullable Boolean> tableIsPresent(DynamoDbAsyncClient lowLevelClient,
Consumer<DescribeTableRequest.Builder> describeTableRequestMutator, ExecutorService executor,
String tableName) {
CompletableFuture<@Nullable Boolean> tableSchema = new CompletableFuture<>();
lowLevelClient.describeTable(b -> b.tableName(tableName).applyMutation(describeTableRequestMutator))
.thenApplyAsync(r -> r.table().tableStatus(), executor)
                .thenApplyAsync(tableStatus -> !tableIsBeingRemoved(tableStatus))
.thenAccept(r -> tableSchema.complete(r)).exceptionally(exception -> {
Throwable cause = exception.getCause();
if (cause instanceof ResourceNotFoundException) {
tableSchema.complete(false);
} else {
logger.warn(
"Could not verify whether table {} is present: {} {}. Cannot determine table schema.",
tableName,
cause == null ? exception.getClass().getSimpleName() : cause.getClass().getSimpleName(),
cause == null ? exception.getMessage() : cause.getMessage());
// Other error, we could not resolve schema...
tableSchema.complete(null);
}
return null;
});
return tableSchema;
}
private boolean tableIsBeingRemoved(TableStatus tableStatus) {
return (tableStatus == TableStatus.ARCHIVING || tableStatus == TableStatus.DELETING
|| tableStatus == TableStatus.ARCHIVED);
}
public ExpectedTableSchema getTableSchema() {
return tableRevision;
}
}
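The `DynamoDBItemVisitor<T>` used above (here and in `DynamoDBQueryUtils`) was changed in this refactor from a void visitor, which wrote its result into a one-element array, to a visitor whose `visit` returns a value directly. A standalone sketch of that pattern, with illustrative names that are not the addon's:

```java
// Generic-return visitor pattern: visit() returns a value instead of
// mutating captured state (e.g. the old `String[] tableName` workaround).
interface ItemVisitor<T> {
    T visitString(String state);
    T visitNumber(java.math.BigDecimal state);
}

abstract class Dto {
    abstract <T> T accept(ItemVisitor<T> visitor);
}

final class StringDto extends Dto {
    final String state;

    StringDto(String state) {
        this.state = state;
    }

    @Override
    <T> T accept(ItemVisitor<T> visitor) {
        // Double dispatch: the concrete type picks the visit overload
        return visitor.visitString(state);
    }
}

final class NumberDto extends Dto {
    final java.math.BigDecimal state;

    NumberDto(java.math.BigDecimal state) {
        this.state = state;
    }

    @Override
    <T> T accept(ItemVisitor<T> visitor) {
        return visitor.visitNumber(state);
    }
}
```

A visitor returning `String` can then compute the legacy table name per DTO type, just as `getTableNameAccordingToLegacySchema` does.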


@@ -0,0 +1,35 @@
/**
* Copyright (c) 2010-2021 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.persistence.dynamodb.internal;
import org.eclipse.jdt.annotation.NonNullByDefault;
/**
* Expected revision of the DynamoDB schema
*
* NEW: Read and create data using new schemas
* LEGACY: Read and create data using old schemas, compatible with first version of DynamoDB persistence addon
 * MAYBE_LEGACY: Try to read and create data using old schemas, but fall back to NEW if the old tables do not exist.
*
* @author Sami Salonen - Initial contribution
*/
@NonNullByDefault
public enum ExpectedTableSchema {
NEW,
LEGACY,
MAYBE_LEGACY;
public boolean isFullyResolved() {
return this != ExpectedTableSchema.MAYBE_LEGACY;
}
}
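The resolution of `MAYBE_LEGACY` (performed asynchronously by `DynamoDBTableNameResolver.resolveSchema` via two `describeTable` calls) boils down to a small decision once table presence is known. A minimal standalone sketch, with names that are illustrative rather than the addon's:

```java
// Hypothetical sketch of the MAYBE_LEGACY resolution decision:
// if neither legacy table (numbers, strings) exists, default to the
// new single-table layout; otherwise keep using the legacy layout.
enum TableSchema {
    NEW,
    LEGACY,
    MAYBE_LEGACY;

    boolean isFullyResolved() {
        return this != MAYBE_LEGACY;
    }

    TableSchema resolve(boolean numberTableExists, boolean stringTableExists) {
        if (isFullyResolved()) {
            return this; // already decided, nothing to do
        }
        return (!numberTableExists && !stringTableExists) ? NEW : LEGACY;
    }
}
```

Note that in the real addon the presence check can also be inconclusive (a `null` Boolean), in which case the schema stays unresolved; this sketch only covers the conclusive cases.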


@@ -0,0 +1,98 @@
/**
* Copyright (c) 2010-2021 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.persistence.dynamodb.internal;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.atomic.AtomicInteger;
import org.eclipse.jdt.annotation.NonNullByDefault;
import org.eclipse.jdt.annotation.Nullable;
import org.reactivestreams.Subscriber;
import org.reactivestreams.Subscription;
/**
 * Subscriber that collects the page of interest from a reactive stream
*
* @author Sami Salonen - Initial contribution
*/
@NonNullByDefault
public class PageOfInterestSubscriber<T> implements Subscriber<T> {
private AtomicInteger skipped = new AtomicInteger();
private int skip;
private @Nullable Subscription subscription;
private int pageIndex;
private int pageSize;
private List<T> page;
private CompletableFuture<List<T>> future;
    /**
     * Create new PageOfInterestSubscriber
     *
     * @param future future that is completed with the page of interest
     * @param pageIndex index of the page we want to subscribe to
     * @param pageSize page size
     */
protected PageOfInterestSubscriber(CompletableFuture<List<T>> future, int pageIndex, int pageSize) {
this.future = future;
this.pageIndex = pageIndex;
this.pageSize = pageSize;
this.page = new ArrayList<>();
this.skip = pageIndex * pageSize;
}
@Override
public void onSubscribe(@Nullable Subscription subscription) {
this.subscription = subscription;
if (subscription != null) {
subscription.request(pageSize * (pageIndex + 1));
}
}
@Override
public void onNext(T t) {
Subscription localSubscription = subscription;
if (localSubscription == null) {
            throw new IllegalStateException(
                    "Subscriber API contract has been violated: expecting a non-null subscription");
}
if (future.isCancelled()) {
localSubscription.cancel();
onError(new InterruptedException());
} else if (skipped.getAndIncrement() >= skip && page.size() < pageSize) {
// We have skipped enough, start accumulating
page.add(t);
if (page.size() == pageSize) {
// We have the full page read
localSubscription.cancel();
onComplete();
}
}
}
@Override
public void onError(@NonNullByDefault({}) Throwable t) {
if (!future.isDone()) {
future.completeExceptionally(t);
}
}
@Override
public void onComplete() {
if (!future.isDone()) {
future.complete(page);
}
}
}
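The paging arithmetic above is simple but easy to get off by one: `skip = pageIndex * pageSize` elements pass by before accumulation starts, and `pageSize * (pageIndex + 1)` elements of upstream demand are requested so the page of interest can be filled. A hypothetical helper (not part of the addon) making that arithmetic explicit:

```java
// Illustrative paging arithmetic used by PageOfInterestSubscriber.
final class PageMath {
    /** Index of the first stream element belonging to the page (the 'skip' field). */
    static int firstIndexOfPage(int pageIndex, int pageSize) {
        return pageIndex * pageSize;
    }

    /** Total upstream demand needed to reach and fill the page of interest. */
    static int itemsToRequest(int pageIndex, int pageSize) {
        return pageSize * (pageIndex + 1);
    }

    /** Whether a zero-based element index lands on the requested page. */
    static boolean belongsToPage(int elementIndex, int pageIndex, int pageSize) {
        int first = firstIndexOfPage(pageIndex, pageSize);
        return elementIndex >= first && elementIndex < first + pageSize;
    }
}
```

For example, page 2 with page size 10 starts at element 20 and requires 30 elements of demand.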


@@ -0,0 +1,255 @@
/**
* Copyright (c) 2010-2021 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.persistence.dynamodb.internal;
import java.time.Duration;
import java.time.Instant;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.CompletionException;
import java.util.concurrent.ExecutorService;
import org.eclipse.jdt.annotation.NonNullByDefault;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import software.amazon.awssdk.core.internal.waiters.ResponseOrException;
import software.amazon.awssdk.enhanced.dynamodb.DynamoDbAsyncTable;
import software.amazon.awssdk.enhanced.dynamodb.model.CreateTableEnhancedRequest;
import software.amazon.awssdk.services.dynamodb.DynamoDbAsyncClient;
import software.amazon.awssdk.services.dynamodb.model.DescribeTableResponse;
import software.amazon.awssdk.services.dynamodb.model.ProvisionedThroughput;
import software.amazon.awssdk.services.dynamodb.model.ResourceInUseException;
import software.amazon.awssdk.services.dynamodb.model.ResourceNotFoundException;
/**
* PutItem request which creates table if needed.
*
* Designed such that competing PutItem requests should complete successfully, only one of them
* 'winning the race' and creating the table.
*
*
* PutItem
* . |
* . \ (ERR: ResourceNotFoundException) (1)
* ....|
* ....CreateTable
* ....|.........\
* .... \ (OK)....\ (ERR: ResourceInUseException) (2)
* ......|..................|
* ..... |..................|
* ..... |...........Wait for table to become active
* ..... |......................\
* ..... |......................| (OK)
* ..... |......................|
* ..... |......................PutItem
* ..... |
* ..... |
* ..... Wait for table to become active
* ......|
* .......\
* ........| (OK)
* ........|
* ........\
* ....... Configure TTL (no-op with legacy schema)
* ..........|
* ...........\ (OK)
* ...........|
* ...........PutItem
*
*
* (1) Most likely table does not exist yet
 * (2) Raised when the table was created by someone else
*
* @author Sami Salonen - Initial contribution
*
*/
@NonNullByDefault
public class TableCreatingPutItem<T extends DynamoDBItem<?>> {
private final Logger logger = LoggerFactory.getLogger(TableCreatingPutItem.class);
private final DynamoDBPersistenceService service;
private T dto;
private DynamoDbAsyncTable<T> table;
private CompletableFuture<Void> aggregateFuture = new CompletableFuture<Void>();
private Instant start = Instant.now();
private ExecutorService executor;
private DynamoDbAsyncClient lowLevelClient;
private DynamoDBConfig dbConfig;
private DynamoDBTableNameResolver tableNameResolver;
public TableCreatingPutItem(DynamoDBPersistenceService service, T dto, DynamoDbAsyncTable<T> table) {
this.service = service;
this.dto = dto;
this.table = table;
this.executor = this.service.getExecutor();
DynamoDbAsyncClient localLowLevelClient = this.service.getLowLevelClient();
DynamoDBConfig localDbConfig = this.service.getDbConfig();
DynamoDBTableNameResolver localTableNameResolver = this.service.getTableNameResolver();
if (localLowLevelClient == null || localDbConfig == null || localTableNameResolver == null) {
throw new IllegalStateException("Service is not ready");
}
lowLevelClient = localLowLevelClient;
dbConfig = localDbConfig;
tableNameResolver = localTableNameResolver;
}
public CompletableFuture<Void> putItemAsync() {
start = Instant.now();
return internalPutItemAsync(false, true);
}
private CompletableFuture<Void> internalPutItemAsync(boolean createTable, boolean recursionAllowed) {
if (createTable) {
// Try again, first creating the table
Instant tableCreationStart = Instant.now();
table.createTable(CreateTableEnhancedRequest.builder()
.provisionedThroughput(
ProvisionedThroughput.builder().readCapacityUnits(dbConfig.getReadCapacityUnits())
.writeCapacityUnits(dbConfig.getWriteCapacityUnits()).build())
.build())//
.whenCompleteAsync((resultTableCreation, exceptionTableCreation) -> {
if (exceptionTableCreation == null) {
logger.trace("PutItem: Table created in {} ms. Proceeding to TTL creation.",
Duration.between(tableCreationStart, Instant.now()).toMillis());
//
// Table creation OK. Configure TTL
//
boolean legacy = tableNameResolver.getTableSchema() == ExpectedTableSchema.LEGACY;
waitForTableToBeActive().thenComposeAsync(_void -> {
if (legacy) {
// We have legacy table schema. TTL configuration is skipped
return CompletableFuture.completedFuture(null);
} else {
// We have the new table schema -> configure TTL
// for the newly created table
return lowLevelClient.updateTimeToLive(req -> req
.overrideConfiguration(this.service::overrideConfig)
.tableName(table.tableName()).timeToLiveSpecification(spec -> spec
.attributeName(DynamoDBItem.ATTRIBUTE_NAME_EXPIRY).enabled(true)));
}
}, executor)
//
// Table is ready and TTL configured (possibly with error)
//
.whenCompleteAsync((resultTTL, exceptionTTL) -> {
if (exceptionTTL == null) {
//
// TTL configuration OK, continue with PutItem
//
logger.trace("PutItem: TTL configured successfully");
internalPutItemAsync(false, false);
} else {
//
// TTL configuration failed, abort
//
logger.trace("PutItem: TTL configuration failed");
Throwable exceptionTTLCause = exceptionTTL.getCause();
aggregateFuture.completeExceptionally(
exceptionTTLCause == null ? exceptionTTL : exceptionTTLCause);
}
}, executor);
} else {
// Table creation failed. We give up and complete the aggregate
// future -- unless the error was ResourceInUseException, in which case wait for
// table to become active and try again
Throwable cause = exceptionTableCreation.getCause();
if (cause instanceof ResourceInUseException) {
logger.trace(
"PutItem: table creation failed (will be retried) with {} {}. Perhaps tried to create table that already exists. Trying one more time without creating table.",
cause.getClass().getSimpleName(), cause.getMessage());
// Wait for the table to become active, then retry PutItem
waitForTableToBeActive().whenCompleteAsync((_tableWaitResponse, tableWaitException) -> {
if (tableWaitException != null) {
// error when waiting for table to become active
Throwable tableWaitExceptionCause = tableWaitException.getCause();
logger.warn(
"PutItem: failed (final) with {} {} when waiting to become active. Aborting.",
tableWaitExceptionCause == null
? tableWaitException.getClass().getSimpleName()
: tableWaitExceptionCause.getClass().getSimpleName(),
tableWaitExceptionCause == null ? tableWaitException.getMessage()
: tableWaitExceptionCause.getMessage());
aggregateFuture.completeExceptionally(
tableWaitExceptionCause == null ? tableWaitException
: tableWaitExceptionCause);
}
}, executor)
// table wait OK, retry PutItem
.thenRunAsync(() -> internalPutItemAsync(false, false), executor);
} else {
logger.warn("PutItem: failed (final) with {} {}. Aborting.",
cause == null ? exceptionTableCreation.getClass().getSimpleName()
: cause.getClass().getSimpleName(),
cause == null ? exceptionTableCreation.getMessage() : cause.getMessage());
aggregateFuture.completeExceptionally(cause == null ? exceptionTableCreation : cause);
}
}
}, executor);
} else {
// First try, optimistically assuming that table exists
table.putItem(dto).whenCompleteAsync((result, exception) -> {
if (exception == null) {
logger.trace("PutItem: DTO {} was successfully written in {} ms.", dto,
Duration.between(start, Instant.now()).toMillis());
aggregateFuture.complete(result);
} else {
// PutItem failed. We retry if the failure was due to a non-existing table (the retry is
// triggered by calling this method again with createTable=true).
// With other errors, we abort.
if (!(exception instanceof CompletionException)) {
logger.error("PutItem: Expecting only CompletionException, got {} {}. BUG",
exception.getClass().getName(), exception.getMessage());
aggregateFuture.completeExceptionally(new IllegalStateException("unexpected exception"));
return;
}
Throwable cause = exception.getCause();
if (cause instanceof ResourceNotFoundException && recursionAllowed) {
logger.trace(
"PutItem: Table '{}' was not present. Retrying, this time creating the table first",
table.tableName());
internalPutItemAsync(true, true);
} else {
logger.warn("PutItem: failed (final) with {} {}. Aborting.",
cause == null ? exception.getClass().getSimpleName() : cause.getClass().getSimpleName(),
cause == null ? exception.getMessage() : cause.getMessage());
aggregateFuture.completeExceptionally(cause == null ? exception : cause);
}
}
}, executor);
}
return aggregateFuture;
}
private CompletableFuture<Void> waitForTableToBeActive() {
return lowLevelClient.waiter()
.waitUntilTableExists(
req -> req.tableName(table.tableName()).overrideConfiguration(this.service::overrideConfig))
.thenAcceptAsync(tableWaitResponse -> {
// if waiter fails, the future is completed exceptionally (not entering this step)
ResponseOrException<DescribeTableResponse> responseOrException = tableWaitResponse.matched();
logger.trace("PutItem: Table wait completed successfully with {} attempts: {}",
tableWaitResponse.attemptsExecuted(), toString(responseOrException));
}, executor);
}
private String toString(ResponseOrException<?> responseOrException) {
if (responseOrException.response().isPresent()) {
return String.format("response=%s", responseOrException.response().get());
} else if (responseOrException.exception().isPresent()) {
Throwable exception = responseOrException.exception().get();
return String.format("exception=%s %s", exception.getClass().getSimpleName(), exception.getMessage());
} else {
return "<N/A>";
}
}
}
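The retry logic above (optimistic put, create the table on `ResourceNotFoundException`, retry exactly once) can be illustrated with a simplified, self-contained sketch. `FakeStore`, `TableMissingException`, and `putWithRetry` are hypothetical stand-ins for the AWS SDK calls, not part of the add-on; the recursion-guard flag mirrors `recursionAllowed` in `internalPutItemAsync`.

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.CompletionException;
import java.util.concurrent.atomic.AtomicBoolean;

public class PutRetrySketch {
    // Stand-in for the SDK's ResourceNotFoundException
    static class TableMissingException extends RuntimeException {
    }

    // Minimal in-memory model: puts fail until the "table" has been created
    static class FakeStore {
        final AtomicBoolean tableExists = new AtomicBoolean(false);

        CompletableFuture<Void> createTable() {
            return CompletableFuture.runAsync(() -> tableExists.set(true));
        }

        CompletableFuture<String> put(String item) {
            return CompletableFuture.supplyAsync(() -> {
                if (!tableExists.get()) {
                    throw new TableMissingException();
                }
                return "stored:" + item;
            });
        }
    }

    // Mirrors internalPutItemAsync: on "table missing", create the table and
    // retry once with retryAllowed=false so the recursion cannot loop forever.
    static CompletableFuture<String> putWithRetry(FakeStore store, String item, boolean retryAllowed) {
        return store.put(item).handle((result, exception) -> {
            if (exception == null) {
                return CompletableFuture.completedFuture(result);
            }
            // Async stages wrap failures in CompletionException; unwrap the cause
            Throwable cause = exception instanceof CompletionException ? exception.getCause() : exception;
            if (cause instanceof TableMissingException && retryAllowed) {
                return store.createTable().thenCompose(v -> putWithRetry(store, item, false));
            }
            return CompletableFuture.<String>failedFuture(cause);
        }).thenCompose(f -> f);
    }

    public static void main(String[] args) {
        FakeStore store = new FakeStore();
        System.out.println(putWithRetry(store, "item1", true).join()); // prints "stored:item1"
    }
}
```

The single-retry guard is the key design point: without it, a put that keeps failing with "table missing" (for example, due to insufficient permissions to create the table) would recurse indefinitely instead of completing the aggregate future exceptionally.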


#
# The following parameters are used to configure Amazon DynamoDB Persistence.
#
# Further details at https://www.openhab.org/addons/persistence/dynamodb/
#
# PID SETTING
#
# When configuring the persistence using a file (instead of the UI),
# make sure the first line in the configuration file is the
# pid definition (remove the comment prefix #)
#pid:pid:org.openhab.dynamodb
#
# CONNECTION SETTINGS (follow OPTION 1 or OPTION 2)
#
# OPTION 2 (using profilesConfigFile and profile)
# where profilesConfigFile points to AWS credentials file
# Please note that the user that runs openHAB must have appropriate read rights to the credential file.
# See below for an example of what the credentials file should look like
#profilesConfigFile=/etc/openhab2/aws_creds
#profile=fooprofile
#region=eu-west-1
# UNCOMMENT THE BELOW ALWAYS (otherwise legacy table schema with 'tablePrefix' is used)
#table=openhab
# Credentials file example:
#
# [fooprofile]
# aws_access_key_id=AKIAIOSFODNN7EXAMPLE
# aws_secret_access_key=3+AAAAABBBbbbCCCCCCdddddd+7mnbIOLH
#
# ADVANCED CONFIGURATION (OPTIONAL)
#
# Expire time for data in days (relative to the stored timestamp).
# Data older than this is removed automatically using the DynamoDB
# Time to Live (TTL) feature.
#expireDays=
# read capacity for the created tables
#readCapacityUnits=1
# write capacity for the created tables
#writeCapacityUnits=1
# table prefix used in the name of created tables
# LEGACY SCHEMA: table prefix used in the name of created tables
#tablePrefix=openhab-
-->
<parameter name="region" type="text" required="true">
<label>AWS region ID</label>
<description><![CDATA[AWS region ID<br />
The region needs to match the region of the AWS user that will access Amazon DynamoDB.<br />
For example, eu-west-1.]]></description>
</parameter>
<parameter name="accessKey" type="text" required="false">
<label>AWS access key</label>
<description><![CDATA[AWS access key<br />
Give either 1) access key and secret key, or 2) credentials file and profile name.
]]></description>
</parameter>
<parameter name="secretKey" type="text" required="false">
<label>AWS secret key</label>
<description><![CDATA[AWS secret key<br />
Give either 1) access key and secret key, or 2) credentials file and profile name.
]]></description>
</parameter>
<parameter name="profilesConfigFile" type="text" required="false">
<label>AWS credentials file</label>
<description><![CDATA[Path to the AWS credentials file. <br />
For example, /etc/openhab/aws_creds. Please note that the user that runs openHAB must have appropriate read rights to the credential file. <br />
Give either 1) access key and secret key, or 2) credentials file and profile name.
]]></description>
</parameter>
<parameter name="profile" type="text" required="false">
<label>Profile name</label>
<description><![CDATA[Profile name in AWS credentials file. <br />
Give either 1) access key and secret key, or 2) credentials file and profile name.
]]></description>
</parameter>
<parameter name="table" type="text" required="false">
<label>Table</label>
<description><![CDATA[Table name. <br />
Specify this parameter instead of Table Prefix to use the new, optimized table format.]]></description>
<default>openhab</default> <!-- set by default, preferring new schema format -->
</parameter>
<parameter name="readCapacityUnits" type="integer" required="false" min="1">
<description><![CDATA[Provisioned read capacity.<br />
Default is 1.]]></description>
<label>Read Capacity</label>
<advanced>true</advanced>
</parameter>
<parameter name="writeCapacityUnits" type="integer" required="false" min="1">
<label>Write Capacity</label>
<description><![CDATA[Provisioned write capacity.<br />
Default is 1.]]></description>
<advanced>true</advanced>
</parameter>
<parameter name="expireDays" type="integer" required="false" min="1">
<label>Data Expiry, in Days</label>
<description><![CDATA[Expire time for data.<br />
Data older than this is automatically removed by the DynamoDB Time to Live (TTL) feature. Leave empty to disable data expiration.
]]></description>
<advanced>true</advanced>
<default></default> <!-- empty by default, giving preference to new table schema -->
</parameter>
<parameter name="tablePrefix" type="text" required="false">
<label>Table Prefix</label>
<description><![CDATA[Legacy: Table prefix used in the name of created tables. <br />
Default is "openhab-"]]></description>
<advanced>true</advanced>
<default></default> <!-- empty by default, giving preference to new table schema -->
</parameter>
</config-description>