Using Lombok to Simplify the Code

The Problem
We use a configuration-driven approach: we store a lot of configuration data in our database so that we can change the configuration at runtime.

To learn more about the feature service, please check Building Troubleshooting-Friendly Application - Using Feature Toggle.

We store the configuration data as a JSON string in the database; please check Merge JSON Objects: Jackson + BeanUtils.copyProperties to learn how to use Jackson and Spring's BeanUtils to merge two objects.

But one issue is that the class is getting bigger and bigger: it contains the fields, their getters, setters, toString, hashCode, equals, and the extractXXX methods that return a non-null value - each reads the value from the database and, if it doesn't exist, returns either a hard-coded default or a default value from the property file.

The Solution
Lombok comes to the rescue. All we need to do is add the @Data annotation; it generates the getters, setters, toString, hashCode, and equals. This reduces the class from 300+ lines to about 100.

We also use SpringContextBridge to read the default value from the property file if the field isn't set in the database.
@Data
public class SimpleConfig implements Serializable {
  private Boolean xxAsync;
  // more fields ...
  @JsonIgnore
  // reads from the db first; if not set, reads from the property file; if still not set, returns the hard-coded default
  public boolean extractXXAsync() {
      return MoreObjects.firstNonNull(xxAsync,
              SpringContextBridge.getProperty("xx.xxAsync", true, Boolean.class));
  }
}
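For reference, here is roughly the boilerplate that @Data replaces - a hand-written sketch of the generated code for a one-field class (plain Java, no Lombok, so it compiles on its own; the class name is only for illustration):

```java
import java.io.Serializable;
import java.util.Objects;

// Hand-written sketch of what Lombok's @Data generates for a single-field class.
public class SimpleConfigDesugared implements Serializable {
    private Boolean xxAsync;

    public Boolean getXxAsync() { return xxAsync; }
    public void setXxAsync(final Boolean xxAsync) { this.xxAsync = xxAsync; }

    @Override
    public boolean equals(final Object o) {
        if (this == o) return true;
        if (!(o instanceof SimpleConfigDesugared)) return false;
        return Objects.equals(xxAsync, ((SimpleConfigDesugared) o).xxAsync);
    }

    @Override
    public int hashCode() { return Objects.hash(xxAsync); }

    @Override
    public String toString() { return "SimpleConfigDesugared(xxAsync=" + xxAsync + ")"; }
}
```

Multiply this by a dozen fields and the savings from a single @Data annotation become obvious.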

@Component
public class SpringContextBridge implements ApplicationContextAware {
    private static ApplicationContext applicationContext;
    @Override
    public void setApplicationContext(final ApplicationContext applicationContext) throws BeansException {
        SpringContextBridge.applicationContext = applicationContext;
    }
    public static <T> T getBean(final Class<T> serviceType) {
        return applicationContext.getBean(serviceType);
    }
    public static <T> T getProperty(final String key, final T defaultValue, final Class<T> targetType) {
        return MoreObjects.firstNonNull(applicationContext.getEnvironment().getProperty(key, targetType), defaultValue);
    }
}
Configuration and Install
Add Lombok to the project's pom.xml. Then run java -jar .m2\repository\org\projectlombok\lombok\latest\lombok-latest.jar to install Lombok into the Eclipse you are using.
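A minimal sketch of the pom.xml entry (the version number here is illustrative - use the current release):

```xml
<dependency>
  <groupId>org.projectlombok</groupId>
  <artifactId>lombok</artifactId>
  <version>1.18.30</version>
  <scope>provided</scope>
</dependency>
```

The provided scope keeps Lombok out of the final artifact, since it is only needed at compile time.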
You may need to run Eclipse with the clean option: eclipse4.5/Eclipse.app/Contents/MacOS/eclipse -clean

Lombok features
@Data
@Accessors(chain = true)
@Setter(AccessLevel.NONE) - make a field read-only
@EqualsAndHashCode(callSuper = true)
@Value
@Builder
@NonNull
@Synchronized - synchronized done right: don't expose your locks
@Log
@Cleanup
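@Builder, for instance, generates the equivalent of a hand-written builder. A plain-Java sketch of what the generated code looks like (no Lombok needed to compile; the class and fields are only for illustration):

```java
// Hand-written equivalent of what Lombok's @Builder generates for a two-field class.
public class ServerConfig {
    private final String host;
    private final int port;

    private ServerConfig(final String host, final int port) {
        this.host = host;
        this.port = port;
    }

    public String getHost() { return host; }
    public int getPort() { return port; }

    public static Builder builder() { return new Builder(); }

    public static class Builder {
        private String host;
        private int port;

        public Builder host(final String host) { this.host = host; return this; }
        public Builder port(final int port) { this.port = port; return this; }
        public ServerConfig build() { return new ServerConfig(host, port); }
    }
}
```

Usage: ServerConfig.builder().host("localhost").port(8080).build().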

Spring Security - Build Multi-Tenant Application

The problem
We are evolving our application from a single-purpose application to a multi-tenant application.

The solution
We use LDAP to authenticate users and define different LDAP groups for different roles, in different environments, for different sub-applications.

On the login page, the user selects which sub-application to log in to. The application calls LDAP to authenticate, which returns the groups the user belongs to. The application then checks the group mapping to decide whether the user can access this sub-application and which roles the user should have.

We also store the sub-application name in the session so it can be used later.


We store the supported applications - the application name and the mapping of the application's LDAP groups - in the database.

Check Spring Security: Integrate In-Memory Authentication for Test Automation for why we add test users in dev environments and how to do it.


Talk is cheap. Show me the code.
@Component
public class MyUsernamePasswordAuthenticationFilter extends UsernamePasswordAuthenticationFilter {
 @Autowired
 private Environment environment;
 @Autowired
 private IConfigService configService;

 @Autowired
 private ApplicationProfile applicationProfile;
  // these test users work across all applications in dev environments
 private final Set<String> testUsers = new HashSet<>();

 @PostConstruct
 public void postConstruct() {
  if (applicationProfile.isDev()) {
   addTestUser("spring.security.test.user.adminOnly.name");
   addTestUser("spring.security.test.user.provisionerOnly.name");
   addTestUser("spring.security.test.user.adminProvisioner.name");
  }
 }

 protected void addTestUser(final String testUserProperty) {
  final String testUser = environment.getProperty(testUserProperty);
  if (StringUtils.isNotBlank(testUser)) {
   testUsers.add(testUser);
  }
 }

 @Autowired
 @Override
 public void setAuthenticationManager(final AuthenticationManager authenticationManager) {
  super.setAuthenticationManager(authenticationManager);
 }

 @Override
 public Authentication attemptAuthentication(final HttpServletRequest request, final HttpServletResponse response)
   throws AuthenticationException {
  final String applicationName = request.getParameter(Util.APPLICATION_NAME);

  if (StringUtils.isEmpty(applicationName)) {
   throw new AuthenticationServiceException(
     MessageFormat.format("Not supported application: {0}", applicationName));
  }

  final Map<String, SupportedAppSecurityConfig> supportedApps = configService.getMySimpleConfig()
    .extractSupportedApplications();
  if (!supportedApps.containsKey(applicationName)) {
   throw new AuthenticationServiceException(
     MessageFormat.format("Not supported application: {0}", applicationName));
  }

  final Authentication auth = super.attemptAuthentication(request, response);

  if (auth.isAuthenticated()) {
   request.getSession(true).setAttribute(Util.APPLICATION_NAME, applicationName);
   if (testUsers.contains(auth.getName())) {
    return auth;
   }
   return checkAuthorizationAndMappingGroup(supportedApps, applicationName, auth);
  }
  return auth;
 }

 protected Authentication checkAuthorizationAndMappingGroup(
    final Map<String, SupportedAppSecurityConfig> supportedApps, final String applicationName,
   final Authentication auth) {
  // mapping group
  final SupportedAppSecurityConfig application = supportedApps.get(applicationName);

   final List<GrantedAuthority> newAuthorities = new ArrayList<>();

  boolean isAdmin = false, isProvisioner = false;
  for (final GrantedAuthority authority : auth.getAuthorities()) {
   if (authority.getAuthority().equals(application.getAdminLadpGroup())) {
    isAdmin = true;
   }
   if (authority.getAuthority().equals(application.getProvisionLdapGroup())) {
    isProvisioner = true;
   }
  }

  if (!isAdmin && !isProvisioner) {
   throw new AuthenticationServiceException(MessageFormat
     .format("User {0} does not have expected authority, having: {1}", auth.getName(), newAuthorities));
  }

  if (isAdmin) {
   newAuthorities.add(new SimpleGrantedAuthority(Util.ADMIN_GROUP));
  }
  if (isProvisioner) {
   newAuthorities.add(new SimpleGrantedAuthority(Util.PROVISION_GROUP));
  }

  final Authentication newAuth = new UsernamePasswordAuthenticationToken(auth.getPrincipal(),
    auth.getCredentials(), newAuthorities);
  return newAuth;
 }
}

@Configuration
@EnableWebSecurity
@EnableGlobalMethodSecurity(prePostEnabled = true)
public class MyWebSecurityConfiguration extends WebSecurityConfigurerAdapter {
      @Autowired
      private MyUsernamePasswordAuthenticationFilter usernamePasswordAuthenticationFilter;
      @Override
      protected void configure(final HttpSecurity http) throws Exception {
          http.authorizeRequests()
          .antMatchers("/* ignored*/").permitAll()
          .antMatchers("/* ignored*/").access(Util.ROLE_PROVISIONER_OR_ADMIN)
          .antMatchers("/* ignored*/").access(Util.ROLE_ADMIN)
          .and().formLogin().loginPage("/login").failureUrl("/loginerror")
          .loginProcessingUrl("/j_spring_security_check").passwordParameter("j_password")
          .usernameParameter("j_username").defaultSuccessUrl("/index.html").and().logout()
          .logoutUrl("/j_spring_security_logout").logoutSuccessUrl("/loggedout")
          .deleteCookies("JSESSIONID", "SESSION")
          .and().sessionManagement().sessionFixation().migrateSession().maximumSessions(1)
          .and().and().addFilter(usernamePasswordAuthenticationFilter);
      }
      // check http://lifelongprogrammer.blogspot.com/2016/04/spring-security-integrate-in-memory.html
      // for implementation
      @Bean @Override
      public AuthenticationManager authenticationManagerBean() throws Exception {
          return super.authenticationManagerBean();
      }
}

@Data
public class SupportedAppSecurityConfig implements Serializable {
    private static final long serialVersionUID = 1L;
    private String name;
    private String adminLadpGroup;
    private String provisionLdapGroup;
}

Using in-memory Embedded Solr

The problem
We store data in Solr Cloud. As the application evolves from a single-purpose message application to a multi-tenant message application, there is much more traffic to our application.

The traffic is high, but the data is small, as we don't have many active messages at any given time.

To boost search performance, we decided to use an in-memory embedded Solr.


The Solution
The admin application still uses CloudSolrRepository to write data into Solr Cloud.

The client-facing application periodically deletes expired data from the embedded Solr and copies (only) updated/new data from Solr Cloud to the embedded Solr.

We change the server code to use EmbeddedSolrRepository, so there is only a small change to the existing code.

DataSyncService.copyMessagesFromSolrCloudToEmbeddedSolr deletes expired data from the embedded Solr and then copies (only) updated/new data from Solr Cloud - it ignores data that already exists (with the same id and _version_ values).

It is called in the ContextLoaderListener, so it copies all data from Solr Cloud to the embedded Solr before application startup finishes.

It is also a scheduled task - it is called periodically.
Here we use SchedulingConfigurer rather than @Scheduled because we want the interval to be configurable and changeable at runtime - @Scheduled only supports reading the value from a property file; it doesn't support calling a bean method.

The admin application can also change the configuration to enable/disable the embedded Solr and change the sync frequency.

Talk is cheap. Show me the code.
@Service(MessageCloudRepository.NAME)
public class MessageCloudRepository extends AbstractMessageRepository {
    public static final String NAME = "MessageCloudRepository";
    @Autowired
    @Qualifier(RestCommonsAppConfig.BEAN_SOLR_CLOUD)
    private SolrClient cloudSolrServer;
    @Override
    public SolrClient getSolrServer() {
        return cloudSolrServer;
    }
}

@Service(MessageEmbeddedRepository.NAME)
public class MessageEmbeddedRepository extends AbstractMessageRepository {
    public static final String NAME = "MessageEmbeddedRepository";
    @Autowired
    @Qualifier(RestCommonsAppConfig.BEAN_EMBEDDED_MESSAGE_)
    private SolrClient embeddedSolrServer;

    @Override
    public SolrClient getSolrServer() {
        return embeddedSolrServer;
    }
}

@Service
public class DataSyncService {
    @Autowired
    @Qualifier(MessageEmbeddedRepository.NAME)
    private IMessageRepository embeddedRepository;
    @Autowired
    private @Qualifier(MessageCloudRepository.NAME) IMessageRepository cloudRepository;
    @Autowired
    private IConfigService configService;

    public void copyMessagesFromSolrCloudToEmbeddedSolr() {
        if (!configService.isEmbeddedMessageSolrEnabled()) {
            return;
        }
        deleteExpiredDataFromEmbeddedSolr();

        final String query = ""; // the query to get new active data
        final SolrQuery solrQuery = new SolrQuery(query);
        // but add filter to ignore data already in embeddedSolr with same id and _version_
        ignoreExistingData(solrQuery);

        final List<Future<List<Message>>> messagesFutures = cloudRepository.findAllAsync(solrQuery);
        if (CollectionUtils.isNotEmpty(messagesFutures)) {
            for (final Future<List<Message>> messagesFuture : messagesFutures) {
                try {
                    final List<Message> messages = messagesFuture.get().stream().map(message -> {
                        message.setVersionFromSolrCloud(message.getVersion());
                        return message;
                    }).collect(Collectors.toList());
                    embeddedRepository.saveWithoutCommit(messages);
                } catch (InterruptedException | ExecutionException e) {
                    logger.error("Failed to copy data from solr cloud to embedded solr.", e);
                }
            }

            embeddedRepository.hardCommit();
        }
    }
    /**
     * Ignore data that is already in the embedded Solr with the same id and _version_.
     */
    protected void ignoreExistingData(final SolrQuery solrQuery) {
        final SolrQuery existingDataQuery = new SolrQuery("*:*").setFields(Abstract.FIELD_ID,
                Message.FIELD_VERSION_FROM_SOLR_CLOUD);

        final Iterable<Message> existingMessages = embeddedRepository.findAllSync(existingDataQuery);
        // NOT ((id:id1 AND _version_:v1) OR (id:id2 AND _version_:v2))
        final Iterator<Message> it = existingMessages.iterator();
        if (it.hasNext()) {
            final StringBuilder sb = new StringBuilder();
            while (it.hasNext()) {
                final Message message = it.next();
                sb.append(MessageFormat.format("({0}:{1} AND {2}:{3,number,#})", Abstract.FIELD_ID,
                        message.getId(), AbstractSolrDocument.FIELD_VERSION_,
                        message.getVersionFromSolrCloud()));

                if (it.hasNext()) {
                    sb.append(SolrUtil.SEPERATOR_OR);
                }
            }
            solrQuery.addFilterQuery(MessageFormat.format("{0}({1})", SolrUtil.NOT, sb.toString()));
        }
    }
}

@Configuration
@EnableScheduling
public class ScheduledTaskConfig implements SchedulingConfigurer {
    @Autowired
    private DataSyncService dataSyncService;
    @Autowired
    private IConfigService configService;
    @Bean(destroyMethod = "shutdown")
    public Executor taskExecutor() {
        return Executors.newScheduledThreadPool(10);
    }
    @Override
    public void configureTasks(final ScheduledTaskRegistrar taskRegistrar) {
        taskRegistrar.setScheduler(taskExecutor());
        taskRegistrar.addTriggerTask(new Runnable() {
            @Override
            public void run() {
                dataSyncService.copyMessagesFromSolrCloudToEmbeddedSolr();
            }
        }, new Trigger() {
            @Override
            public Date nextExecutionTime(final TriggerContext triggerContext) {
                final Calendar nextExecutionTime = new GregorianCalendar();
                final Date lastActualExecutionTime = triggerContext.lastActualExecutionTime();
                nextExecutionTime.setTime(lastActualExecutionTime != null ? lastActualExecutionTime : new Date());

                nextExecutionTime.add(Calendar.MILLISECOND,
                        configService.getSimpleConfig().extractSyncMessageToEmbeddedSolrIntervalInMill());
                return nextExecutionTime.getTime();
            }
        });
    }
}

Solr - Create custom data transformer to remove fields

Overview
Create a custom data transformer to remove fields, and to remove a field from JSON data, in Solr.

The Problem
We store campaign messages in Solr. One type of campaign is a voucher. We return this user's voucher and other data based on the user's accountId.

To support this, we add one searchable field, accountIds, which includes all accountIds for this campaign, and another non-searchable field, details, which is a JSON string (mapped to a Java class). It includes a vouchers property - a mapping from accountId to voucherCode.
-- We chose this approach to be consistent with the existing data and to keep the server code simpler.

The accountIds and details.vouchers fields are big, and when returning data to the client we actually only need this user's voucherCode.

The Solution
Excluding one field
We can build an fl that includes all fields except accountIds. This is cumbersome: every time we add a new field, we have to change the fl in SolrQuery.

[SOLR-3191] field exclusion from fl is promising, but it has not been merged into a Solr release.

So we create a data transformer which supports the following params:
removeFields - which fields to remove
Example: removeFields=accountIds,field1,field2

removeOthersVoucher - enables the feature when true
If removeOthersVoucher is true:
If accountId is empty, then remove all voucherCodes from the details field.
If accountId is not empty, then remove all voucherCodes except accountId's voucher.

How to use it
fl=*,[removeFields]&removeFields=accountIds&removeOthersVoucher=true&accountId=account1

Writing Custom Data Transformer
We use Jackson's ObjectMapper to deserialize the details field from a String to a Map<String, Object>.

public class MyTransformerFactory extends TransformerFactory {
    protected static Logger logger = LoggerFactory.getLogger(MyTransformerFactory.class);
    private boolean enabled = false;

    @Override
    public void init(@SuppressWarnings("rawtypes") final NamedList args) {
        try {
            super.init(args);
            if (args != null) {
                enabled = SolrParams.toSolrParams(args).getBool("enabled", true);
            }
        } catch (final Exception e) {
            logger.error("MyTransformerFactory init failed", e);
        }
    }

    @Override
    public DocTransformer create(final String field, final SolrParams params, final SolrQueryRequest req) {
        final SolrParams reqParams = req.getParams();
        final String removeFields = reqParams.get("removeFields");
        final boolean removeOthersVoucher = reqParams.getBool("removeOthersVoucher", false);
        final String accountId = reqParams.get("accountId");
        if (!enabled || (removeFields == null && !removeOthersVoucher)) {
            return null;
        }
        return new MyTransformer(removeFields, removeOthersVoucher, accountId);
    }

    private static class MyTransformer extends DocTransformer {
        private static final String FIELD_DETAILS = "details";
        private static final String DETAIL_VOUCHER_CODES = "voucherCodes";

        private static ObjectMapper objectMapper = new ObjectMapper();
        private static Splitter splitter = Splitter.on(",").trimResults();

        private final String removeFields;
        private final boolean removeOthersVoucher;
        private final String accountId;

        public MyTransformer(final String removeFields, final boolean removeOthersVoucher, final String accountId) {
            this.removeFields = removeFields;
            this.removeOthersVoucher = removeOthersVoucher;
            this.accountId = accountId;
        }
        @Override
        public String getName() {
            return MyTransformer.class.getSimpleName();
        }

        @Override
        public void transform(final SolrDocument doc, final int docid) throws IOException {
            if (removeFields != null) {
                final Iterable<String> it = splitter.split(removeFields);
                for (final String removeField : it) {
                    doc.removeFields(removeField);
                }
            }
            try {
                if (removeOthersVoucher) {
                    removeOthersVoucher(doc);
                }
            } catch (final Exception e) {
                // ignore it if there is exception
                logger.error("MyTransformer transform failed", e);
            }
        }

        protected void removeOthersVoucher(final SolrDocument doc)
                throws IOException, JsonParseException, JsonMappingException, JsonProcessingException {
            final String detailsObj = getFieldValue(doc, FIELD_DETAILS);
            if (detailsObj == null) {
                return;
            }

            final Map<String, Object> detailsMap = objectMapper.readValue(detailsObj,
                    TypeFactory.defaultInstance().constructMapType(Map.class, String.class, Object.class));
            if (detailsMap == null) {
                return;
            }
            final Object voucherCodesObj = detailsMap.get(DETAIL_VOUCHER_CODES);
            if (!(voucherCodesObj instanceof Map)) {
                return;
            }
            final Map<String, String> voucherCodesMap = (Map<String, String>) voucherCodesObj;
            final String voucherCode = voucherCodesMap.get(accountId);

            final Map<String, String> myVoucherMap = new HashMap<String, String>();
            if (voucherCode != null) {
                myVoucherMap.put(accountId, voucherCode);
            }
            detailsMap.put(DETAIL_VOUCHER_CODES, myVoucherMap);

            doc.setField(FIELD_DETAILS, objectMapper.writeValueAsString(detailsMap));
        }
    }

    public static String getFieldValue(final SolrDocument doc, final String field) {
        final List<String> rst = new ArrayList<String>();
        final Object obj = doc.get(field);
        getFieldvalues(doc, rst, obj);

        if (rst.isEmpty()) {
            return null;
        }
        return rst.get(0);
    }

    public static void getFieldvalues(final SolrDocument doc, final List<String> rst, final Object obj) {
        if (obj == null) {
            return;
        }
        if (obj instanceof org.apache.lucene.document.Field) {
            final org.apache.lucene.document.Field field = (Field) obj;
            final String oldValue = field.stringValue();
            if (oldValue != null) {
                rst.add(oldValue);
            }
        } else if (obj instanceof IndexableField) {
            final IndexableField field = (IndexableField) obj;
            final String oldValue = field.stringValue();
            if (oldValue != null) {
                rst.add(oldValue);
            }
        } else if (obj instanceof Collection) {
            final Collection colls = (Collection) obj;
            for (final Object newObj : colls) {
                getFieldvalues(doc, rst, newObj);
            }
        } else {
            logger.error(MessageFormat.format("type: {0}", obj.getClass()));
            rst.add(obj.toString());
        }
    }
}
Add the transformer to solrConfig.xml

<lib dir="../../../lib" regex="lifelongprogrammer-solr-extension-jar-with-dependencies.jar" />

<transformer name="removeFields" class="com.lifelongprogrammer.solr.MyTransformerFactory">
  <bool name="enabled">true</bool>
</transformer>

pom.xml - Build solr-extension jar
We declare the scope of solr-core as provided and use the maven-assembly-plugin to build a jar-with-dependencies.

<build>
  <finalName>lifelongprogrammer-solr-extension</finalName>
  <plugins>
    <plugin>
      <artifactId>maven-assembly-plugin</artifactId>
      <version>2.6</version>
      <configuration>
        <descriptorRefs>
          <descriptorRef>jar-with-dependencies</descriptorRef>
        </descriptorRefs>
      </configuration>
      <executions>
        <execution>
          <id>make-assembly</id>
          <phase>package</phase>
          <goals>
            <goal>single</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>

<dependencies>
  <dependency>
    <groupId>org.apache.solr</groupId>
    <artifactId>solr-core</artifactId>
    <version>5.2.0</version>
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>2.7.4</version>
  </dependency>
</dependencies>
