Hibernate table not found in the schema

Problem description

I have a problem: I have several microservices, and one of them throws an exception while the others don't and work perfectly...

[2020-09-28 16:55:38.304]|ERROR|TIBCO EMS Session dispatcher (21297)|org.hibernate.engine.jdbc.spi.SqlExceptionHelper.logExceptions(142): --- ERROR: relation "computed.fluxeventlogging" does not exist
  Position: 502
[2020-09-28 16:55:38.307]|INFO |TIBCO EMS Session dispatcher (21297)|org.hibernate.event.internal.DefaultLoadEventListener.doOnLoad(116): --- HHH000327: Error performing load command
org.hibernate.exception.SQLGrammarException: could not extract ResultSet
        at org.hibernate.exception.internal.SQLStateConversionDelegate.convert(SQLStateConversionDelegate.java:103) ~[hibernate-core-5.4.12.Final.jar:5.4.12.Final]
        at org.hibernate.exception.internal.StandardSQLExceptionConverter.convert(StandardSQLExceptionConverter.java:42) ~[hibernate-core-5.4.12.Final.jar:5.4.12.Final]
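For reference, a metadata check like the sketch below can list what that same JDBC connection actually sees under the computed schema. This is only a hypothetical diagnostic, not part of the original services; the URL and credentials are assumptions taken from the configuration further down, and DatabaseMetaData.getTables is standard JDBC.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;

// Hypothetical diagnostic: lists the tables the connecting user can see in the "computed" schema.
public class ComputedSchemaCheck {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/postgres", "postgres", "postgres");
             ResultSet rs = con.getMetaData()
                     .getTables(null, "computed", "%", new String[] {"TABLE"})) {
            while (rs.next()) {
                System.out.println(rs.getString("TABLE_SCHEM") + "." + rs.getString("TABLE_NAME"));
            }
        }
    }
}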

The database user has the right privileges, as shown here:

(screenshot of the schema's tables and the granted privileges)

My entities live in a common project used by all the microservices, pulled in as a Maven dependency.

@Entity
@Data
@Table(name="fluxeventlogging",schema = "computed")
@IdClass(FluxEventLoggingIdEntity.class)
public class FluxEventLoggingEntity implements Serializable {
    @Id
    @Column(name = "fluxeventuuid",columnDeFinition = "uuid")
    private UUID   fluxEventUuid;
    @Column(name = "lastupdatedate")
    private Instant lastUpdateDate;
    @Column(name = "businessfluxtype")
    private String businessFluxType;
    @Column(name = "fluxprocessortype")
    private String fluxProcessorType;
    @Column(name = "valuewhichcauseupdate")
    private String valueWhichCauseUpdate;
    @Column(name = "oldvaluecause")
    private String oldValueCause;
    @Column(name = "newvaluecause")
    private String newValueCause;
    @Column(name = "currenteventlifestate")
    private String currentEventLifeState;
    @Column(name = "nexteventlifestate")
    private String nextEventLifeState;
    @Column(name = "generatederror",columnDeFinition = "text")
    private String generatedError;
}
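The FluxEventLoggingIdEntity referenced by @IdClass above is not shown in the post; as a rough sketch (assuming the key class simply mirrors the single fluxeventuuid id column), it would need to look something like this:

import java.io.Serializable;
import java.util.Objects;
import java.util.UUID;

// Hypothetical sketch of the id class; the real FluxEventLoggingIdEntity is not shown.
public class FluxEventLoggingIdEntity implements Serializable {

    // Must match the name and type of the @Id field in FluxEventLoggingEntity.
    private UUID fluxEventUuid;

    public FluxEventLoggingIdEntity() {
    }

    public FluxEventLoggingIdEntity(UUID fluxEventUuid) {
        this.fluxEventUuid = fluxEventUuid;
    }

    // equals() and hashCode() are mandatory for a JPA id class.
    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof FluxEventLoggingIdEntity)) return false;
        return Objects.equals(fluxEventUuid, ((FluxEventLoggingIdEntity) o).fluxEventUuid);
    }

    @Override
    public int hashCode() {
        return Objects.hash(fluxEventUuid);
    }
}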

I have pinned the Hibernate version in the pom to:

<dependency>
    <groupId>org.hibernate</groupId>
    <artifactId>hibernate-core</artifactId>
    <version>5.4.12.Final</version>
</dependency>

because this project is the only one that uses, at the same time, a datasource for JPA:

custom.datasource.url: jdbc:postgresql://localhost:5432/postgres
custom.datasource.database: postgres
custom.datasource.driver: pool
custom.datasource.protocol: postgres
custom.datasource.localhost: localhost
custom.datasource.port: 5432
custom.datasource.password: postgres
custom.datasource.username: postgres
custom.datasource.driverclassname: org.postgresql.Driver

spring:
  jpa:
    properties:
      hibernate:
        dialect: org.hibernate.dialect.PostgreSQL82Dialect
    hibernate:
      ddl-auto: create
    show-sql: false
    database-platform: org.hibernate.dialect.PostgreSQLDialect
  datasource:
    driver-class-name: org.postgresql.Driver
    url: jdbc:postgresql://localhost:5432/postgres
    username: postgres
    password: postgres
    schema: classpath:/schema.sql
    initialization-mode: always
  r2dbc:
    url: r2dbc:postgresql://postgres:postgres@localhost:5432/postgres
    pool:
      enabled: true
      initial-size: 0
      max-size: 500
      max-idle-time: 30m
      validation-query: SELECT 1

and another datasource for R2DBC (reactive PostgreSQL):

@Configuration
@Slf4j
public class DatabaseConfiguration {

    private static Map<String,Object> PROPERTIES;

    @Autowired
    DataSource dataSource;

    @Bean
    public ConnectionFactory r2dbcConnectionFactory() {
        if(PROPERTIES == null) {
            Yaml yaml = new Yaml();
            InputStream inputStream = this.getClass()
                    .getClassLoader()
                    .getResourceAsStream("application.yml");
            PROPERTIES = yaml.load(inputStream);
        }
        log.info("Init rd2dbc with host: {}",PROPERTIES.get("custom.datasource.localhost").toString());
        log.info("Init rd2dbc with port: {}",PROPERTIES.get("custom.datasource.port").toString());
        log.info("Init rd2dbc with database: {}",PROPERTIES.get("custom.datasource.database").toString());
        log.info("Init rd2dbc with username: {}",PROPERTIES.get("custom.datasource.username").toString());
        log.info("Init rd2dbc with driver: {}",PROPERTIES.get("custom.datasource.driver").toString());
        log.info("Init rd2dbc with protocol: {}",PROPERTIES.get("custom.datasource.protocol").toString());
        ConnectionFactoryOptions options = ConnectionFactoryOptions.builder()
                .option(ConnectionFactoryOptions.DRIVER, PROPERTIES.get("custom.datasource.driver").toString())
                .option(ConnectionFactoryOptions.PROTOCOL, PROPERTIES.get("custom.datasource.protocol").toString())
                .option(ConnectionFactoryOptions.USER, PROPERTIES.get("custom.datasource.username").toString())
                .option(ConnectionFactoryOptions.PASSWORD, PROPERTIES.get("custom.datasource.password").toString())
                .option(ConnectionFactoryOptions.HOST, PROPERTIES.get("custom.datasource.localhost").toString())
                .option(ConnectionFactoryOptions.PORT, Integer.parseInt(PROPERTIES.get("custom.datasource.port").toString()))
                .option(ConnectionFactoryOptions.DATABASE, PROPERTIES.get("custom.datasource.database").toString())
                .build();
        return ConnectionFactories.get(options);
        //return ConnectionFactories.get(ConnectionFactoryOptions.parse(PROPERTIES.get("custom.r2dbc.url").toString()));
    }

    @Bean
    public DataSource getDataSource() {
        if(PROPERTIES == null) {
            Yaml yaml = new Yaml();
            InputStream inputStream = this.getClass()
                    .getClassLoader()
                    .getResourceAsStream("application.yml");
            PROPERTIES = yaml.load(inputStream);
        }
        log.info("Init datasource with url: {}",PROPERTIES.get("custom.datasource.url").toString());
        log.info("Init datasource with username: {}",PROPERTIES.get("custom.datasource.username").toString());
        log.info("Init datasource with driver: {}",PROPERTIES.get("custom.datasource.driverclassname").toString());
        DataSourceBuilder dataSourceBuilder = DataSourceBuilder.create();
        dataSourceBuilder.url(PROPERTIES.get("custom.datasource.url").toString());
        dataSourceBuilder.username(PROPERTIES.get("custom.datasource.username").toString());
        dataSourceBuilder.password(PROPERTIES.get("custom.datasource.password").toString());
        dataSourceBuilder.driverClassName(PROPERTIES.get("custom.datasource.driverclassname").toString());
        return dataSourceBuilder.build();
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
        // JpaVendorAdapter can be autowired as well if it's configured in application properties.
        HibernateJpaVendorAdapter vendorAdapter = new HibernateJpaVendorAdapter();
        vendorAdapter.setGenerateDdl(false);

        LocalContainerEntityManagerFactoryBean factory = new LocalContainerEntityManagerFactoryBean();
        factory.setJpaVendorAdapter(vendorAdapter);
        //Add package to scan for entities.
        factory.setPackagesToScan("fr.microservice2.database","fr.microservice.common");
        factory.setDataSource(dataSource);
        return factory;
    }

    @Bean
    public PlatformTransactionManager transactionManager(EntityManagerFactory entityManagerFactory) {
        JpaTransactionManager txManager = new JpaTransactionManager();
        txManager.setEntityManagerFactory(entityManagerFactory);
        return txManager;
    }
}
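For context, the kind of lookup that produces the "Error performing load command" above would be along these lines. This is only a sketch: the actual repository/service code of the failing microservice is not shown, and the id-class constructor used here is the hypothetical one sketched earlier.

import java.util.UUID;

import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

// Hypothetical caller: illustrates the load that Hibernate translates into a SELECT
// against computed.fluxeventlogging via the @Table(schema = "computed") mapping.
@Service
public class FluxEventLoggingLookup {

    @PersistenceContext
    private EntityManager entityManager;

    @Transactional(readOnly = true)
    public FluxEventLoggingEntity findByUuid(UUID fluxEventUuid) {
        return entityManager.find(FluxEventLoggingEntity.class,
                new FluxEventLoggingIdEntity(fluxEventUuid));
    }
}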

I don't understand why only this microservice cannot find my table computed.fluxeventlogging... even though, in that same screenshot, there are plenty of other tables and none of them cause any problem.

Does anyone have an idea? Thank you, best regards.

Solution

No working solution for this problem has been found yet.
