first draft/experiment with new structure #623

Draft
wants to merge 11 commits into base: main

Conversation

wagmarcel
Member

cont

first early good signs

removed duplication of base queries

Add all relationship checks

Started to work on property checks

started cleaning up and removing attribute.id and attribute.index

Add common.test.js

Add test for ConnectionManager

Adding table creation updates

Revert "Add test for ConnectionManager"

This reverts commit 28d42af.

Added inheritance

DatatypeConstraint check

Add minmax

+MaxInclusive

Add Ins constraint

CountConstraint for properties

Added min/maxlength for strings

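The constraint commits above (DatatypeConstraint, min/max, MaxInclusive, the In constraint, CountConstraint, min/max length for strings) correspond to SHACL-style property checks. As a rough, non-authoritative illustration of what such checks mean, a minimal plain-Python sketch follows; the PR itself generates these checks as SQL/SPARQL rather than evaluating them like this, and the constraint keys below are just SHACL-inspired names.

```python
# Illustrative only: SHACL-style property checks sketched in plain Python.
# Constraint keys mirror SHACL terms; this is not the project's implementation.

def check_property(values, constraint):
    """Return a list of violation messages for one property's values."""
    violations = []
    # CountConstraint: number of attribute instances must stay within bounds.
    if "minCount" in constraint and len(values) < constraint["minCount"]:
        violations.append("too few values")
    if "maxCount" in constraint and len(values) > constraint["maxCount"]:
        violations.append("too many values")
    for v in values:
        # DatatypeConstraint: value must have the expected type.
        if "datatype" in constraint and not isinstance(v, constraint["datatype"]):
            violations.append(f"{v!r} has wrong datatype")
            continue
        # MinInclusive / MaxInclusive: numeric range check.
        if "minInclusive" in constraint and v < constraint["minInclusive"]:
            violations.append(f"{v!r} below minimum")
        if "maxInclusive" in constraint and v > constraint["maxInclusive"]:
            violations.append(f"{v!r} above maximum")
        # In constraint: value must be one of an enumerated set.
        if "in" in constraint and v not in constraint["in"]:
            violations.append(f"{v!r} not in allowed set")
        # Min/max length for strings.
        if isinstance(v, str):
            if "minLength" in constraint and len(v) < constraint["minLength"]:
                violations.append(f"{v!r} too short")
            if "maxLength" in constraint and len(v) > constraint["maxLength"]:
                violations.append(f"{v!r} too long")
    return violations

print(check_property([5, 42], {"datatype": int, "minInclusive": 0,
                               "maxInclusive": 40, "maxCount": 2}))
# ['42 above maximum']
```
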
update kms tests

index/datasetId fix

Updated ngsild table creation and renamed model-example to model-instance

renamed model-example to model-instance

sparql/construct running through

Add superclass checks to sparql query

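For the superclass checks mentioned above, a type test typically has to accept instances of any subclass of the target class as well. A minimal sketch of how that can be phrased in SPARQL with the rdfs:subClassOf* property path, shown here with rdflib and made-up names (ex:Machine, ex:Cutter); the queries generated in this PR will differ in detail.

```python
# Illustrative only: superclass-aware type check via rdfs:subClassOf*.
from rdflib import Graph, Namespace, RDF, RDFS

EX = Namespace("http://example.com/")
g = Graph()
g.add((EX.Cutter, RDFS.subClassOf, EX.Machine))   # Cutter is a subclass of Machine
g.add((EX.cutter1, RDF.type, EX.Cutter))          # instance typed with the subclass

query = """
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX ex:   <http://example.com/>
SELECT ?entity WHERE {
  ?entity a ?type .
  ?type rdfs:subClassOf* ex:Machine .
}
"""
for row in g.query(query):
    print(row.entity)   # -> http://example.com/cutter1
```
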
update kms-tests for sparql-checks

kms test8 corrected

fixing test3 and test4 of kms-rules

adapted test7 in kms-rules

kms-udf tests6 adapted

Updated core-services to remove id, index and work with datasetId

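As background to dropping the artificial index: NGSI-LD already distinguishes multiple instances of the same attribute by their datasetId, so attribute instances can be keyed by it directly. A small illustration with made-up field names, not the actual table layout used in this PR:

```python
# Illustrative only: keying multi-attributes by datasetId instead of a
# positional index. Field names and URNs are made up.
readings = [
    {"attribute": "speed", "datasetId": "urn:ngsi-ld:dataset:raw",      "value": 10},
    {"attribute": "speed", "datasetId": "urn:ngsi-ld:dataset:filtered", "value": 9.8},
]

# key by (attribute, datasetId) rather than (attribute, index)
by_dataset = {(r["attribute"], r["datasetId"]): r["value"] for r in readings}
print(by_dataset[("speed", "urn:ngsi-ld:dataset:filtered")])   # 9.8
```
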
creation of Property and Relationship tables contained a bug

first successful Flink deploy

Moved Flink version to 1.19

Add Kafka topics to Property/relationship table

Moved to structure with explicit fields to cover the special case of deletion

working except min/maxvalues

continued using deleted structures and removed deleted from attributes-insert tables

working with all property and relationship checks

grouping of edeleted

Add deleted to attributes table

Updated CountConstraint calculation

Progressing slowly; the combination of deleted and types still needs some fine-tuning. Also, there seems to be a problem with the ngsild-update bridge, Scorpio, or both

Updated checkstable to be upsert-kafka, added TRY_CATCH, updated debug info

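For readers unfamiliar with the upsert-kafka connector: a Flink table backed by it is keyed by a primary key, so later records with the same key update earlier ones instead of appending. A generic DDL sketch with made-up table name, columns, and topic; the actual checks table in this PR may look different.

```python
# Illustrative only: general shape of a Flink SQL DDL for an upsert-kafka
# table, embedded as a Python string. All identifiers and the topic are made up.
CHECKS_TABLE_DDL = """
CREATE TABLE checks (
    entity_id   STRING,
    check_name  STRING,
    severity    STRING,
    PRIMARY KEY (entity_id, check_name) NOT ENFORCED
) WITH (
    'connector' = 'upsert-kafka',
    'topic' = 'iff.checks',
    'properties.bootstrap.servers' = 'kafka:9092',
    'key.format' = 'json',
    'value.format' = 'json'
)
"""

if __name__ == "__main__":
    print(CHECKS_TABLE_DDL)
```
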
Updated sparql creation to fix WC2 bug

kms test run through now!

Moved Flink version to 1.19

Update debezium bridge

update diffAttribute function

DebeziumBridge linting

Fix testTimescaleDB unit tests

Fix make deploy-flink to only build once

Beamservicesoperator: check first if a job with the name already exists; if yes, adopt it

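A minimal sketch of that check-name-before-deploy policy, using hypothetical helpers (list_jobs, adopt_job, deploy_job); the real operator works against the Flink/Kubernetes APIs rather than these functions.

```python
# Illustrative only: adopt an already-running job with the same name
# instead of submitting a duplicate. Helper callables are stand-ins.

def ensure_job(name, spec, list_jobs, adopt_job, deploy_job):
    """Deploy a job unless one with the same name already runs; then adopt it."""
    existing = {job["name"]: job for job in list_jobs()}
    if name in existing:
        # A job with this name is already running: take ownership of it.
        return adopt_job(existing[name])
    return deploy_job(name, spec)

# toy usage with in-memory stand-ins
jobs = [{"name": "shacl-checks", "id": 1}]
print(ensure_job("shacl-checks", spec={},
                 list_jobs=lambda: jobs,
                 adopt_job=lambda job: f"adopted {job['id']}",
                 deploy_job=lambda name, spec: f"deployed {name}"))
# -> adopted 1
```
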
Add labels to rdf-maps to make sure they can be deleted cleanly

linting

linting

linting

linting

introduced 'synched' field in attributes to avoid forwarding already synched values to Scorpio

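A minimal sketch of the idea behind the 'synched' flag, with made-up attribute dictionaries: attributes already marked as synchronized are filtered out before forwarding, which avoids echoing updates back to the NGSI-LD broker.

```python
# Illustrative only: filter out attributes that were already synchronized.
# Field names are made up; this is not the bridge's actual data model.

def attributes_to_forward(attributes):
    """Keep only attributes that still need to be pushed downstream."""
    return [attr for attr in attributes if not attr.get("synched", False)]

attrs = [
    {"name": "temperature", "value": 21.5, "synched": True},
    {"name": "state", "value": "ON", "synched": False},
]
print(attributes_to_forward(attrs))   # only the 'state' attribute is forwarded
```
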
Updated core-services and core-tables

fixed create_ngsild_tables tests

Fix some unit tests

Added unit tests to reach 80%

linting

Fixed DebeziumBridge tests

Updated flink debug notebooks

Update semantic-model part

Update debezium bridge

Update .github pipeline

github workflow

start integration of value/object unification, unitcode, parentId, etc

corrected order of attributes

Updated kms tests to reflect new simplified table

Working on debezium bridge recursive parsing

Revert "Working on debezium bridge recursive parsing"

This reverts commit 174dadd.

Working on core-table

Flink SQL operator with the new check-name-before-deploy policy; still needs unit tests

linting and unit tests

Add test to reach 80%

linting
