I want to bulk-insert data parsed from JSON into the database. I use the method below to apply the batch. The problem is that mDbWritable.beginTransaction() takes too long to execute, often around 6 seconds! I can't figure out where the problem is. Any ideas what could cause such a long execution time? Thanks very much.
@Override
public ContentProviderResult[] applyBatch(ArrayList<ContentProviderOperation> operations)
        throws OperationApplicationException {
    long start = System.currentTimeMillis();
    mDbWritable.beginTransaction();
    long time = System.currentTimeMillis() - start;
    Alog.i(TAG, "Time applyBatch beginTransaction: " + time);
    final int numOperations = operations.size();
    final ContentProviderResult[] results = new ContentProviderResult[numOperations];
    try {
        for (int i = 0; i < numOperations; i++) {
            results[i] = operations.get(i).apply(this, results, i);
        }
        mDbWritable.setTransactionSuccessful();
    } finally {
        mDbWritable.endTransaction();
    }
    return results;
}
Some samples from the log:
11-16 15:14:53.726: I/ApiProvider(21442): Time applyBatch beginTransaction: 6025
11-16 15:15:00.713: I/ApiProvider(21442): Time applyBatch beginTransaction: 4940
11-16 15:15:17.819: I/ApiProvider(21442): Time applyBatch beginTransaction: 8651
11-16 15:15:45.346: I/ApiProvider(21442): Time applyBatch beginTransaction: 12672
11-16 15:16:16.807: I/ApiProvider(21442): Time applyBatch beginTransaction: 12411
11-16 15:16:45.685: I/ApiProvider(21442): Time applyBatch beginTransaction: 12247
11-16 15:17:01.500: I/ApiProvider(21442): Time applyBatch beginTransaction: 12788
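For context (my addition, not from the original post): SQLiteDatabase.beginTransaction() acquires an exclusive lock on the database, so the call blocks until every other open transaction or long-running statement on that database finishes. If another thread (for example a sync loop that also writes) holds a transaction, the wait shows up exactly as multi-second beginTransaction times like the ones above. A minimal plain-Java sketch of that blocking pattern, using a ReentrantLock as a stand-in for the database lock:

```java
import java.util.concurrent.locks.ReentrantLock;

// Sketch: a background thread holds the "database lock" for a while;
// the main thread's lock() call (standing in for beginTransaction())
// blocks for roughly that long before it can proceed.
public class LockWaitDemo {
    static final ReentrantLock dbLock = new ReentrantLock();

    public static long measureAcquireMillis(long holderMillis) throws InterruptedException {
        Thread holder = new Thread(() -> {
            dbLock.lock();
            try {
                Thread.sleep(holderMillis); // simulates a long-running transaction
            } catch (InterruptedException ignored) {
            } finally {
                dbLock.unlock();
            }
        });
        holder.start();
        Thread.sleep(50); // give the holder time to grab the lock first

        long start = System.currentTimeMillis();
        dbLock.lock();    // blocks, just like beginTransaction() under contention
        long waited = System.currentTimeMillis() - start;
        dbLock.unlock();
        holder.join();
        return waited;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("waited ~" + measureAcquireMillis(300) + " ms to acquire the lock");
    }
}
```

If this is the cause, the fix is to find who else is writing (another thread, a CursorAdapter requery, a second helper instance) rather than to tune the batch itself.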
EDIT: While parsing the JSON I call applyBatch in a loop, i.e. for each item in the JSON I parse it and apply a batch. Each batch contains insert, update and delete operations.
Cursor starredChannelsCursor = mContentResolver.query(
        ApiContract.Channels.CONTENT_URI,
        new String[] { BaseColumns._ID, ChannelsTable.ID, ChannelsTable.SLUG },
        ChannelsTable.IS_STARRED + "=?",
        new String[] { "1" },
        null);
String userName = mSettings.getUserName();
if (starredChannelsCursor != null && starredChannelsCursor.moveToFirst()) {
    while (!starredChannelsCursor.isAfterLast()) {
        String channelSlug = starredChannelsCursor.getString(2);
        ChannelHandler channelHandler = new ChannelHandler(this);
        URI channelApiUri = Constants.getChannelApiURI(channelSlug, userName);
        // execute update, makes the applyBatch call
        executeUpdate(channelApiUri, channelHandler);
        starredChannelsCursor.moveToNext();
    }
}
if (starredChannelsCursor != null) {
    starredChannelsCursor.close();
}

/**
 * Make a call to the Uri, parse the response and apply batch operations via
 * the ContentResolver.
 *
 * @param apiUri
 * @param handler handles parsing
 */
private boolean executeUpdate(URI apiUri, AbstractJSONHandler handler) {
    ApiResponse apiResponse = mHttpHelper.doHttpCall(apiUri);
    ArrayList<ContentProviderOperation> batch = new ArrayList<ContentProviderOperation>();
    if (apiResponse != null) {
        batch = handler.parse(apiResponse);
        Alog.v(TAG, "update user data from " + apiUri);
    }
    if (batch.size() > 0) {
        try {
            mContentResolver.applyBatch(ApiContract.CONTENT_AUTHORITY, batch);
        } catch (Exception e) {
            Alog.v(TAG, "Error: " + e.getMessage());
        }
    }
    return true;
}
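One way to reduce the per-transaction overhead (a sketch under the assumption that opening a transaction per JSON item is what dominates): instead of calling applyBatch() once per item inside the loop, accumulate the operations from several items and flush them in one large batch. BatchAccumulator below is a hypothetical helper, not part of the Android API; in the real code flush() would be the place that calls mContentResolver.applyBatch(...), so the number of transactions drops from one per item to one per flush.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch (hypothetical helper): collect operations from many parsed items
// and flush them as one large batch, paying the transaction cost once per
// flush instead of once per item.
public class BatchAccumulator<T> {
    private final List<T> pending = new ArrayList<>();
    private final int flushThreshold;
    private int flushCount = 0;

    public BatchAccumulator(int flushThreshold) {
        this.flushThreshold = flushThreshold;
    }

    /** Queue one item's operations; flush automatically when the batch is large. */
    public void add(List<T> itemOperations) {
        pending.addAll(itemOperations);
        if (pending.size() >= flushThreshold) {
            flush();
        }
    }

    /** In the real provider, this is where mContentResolver.applyBatch(...) would run. */
    public void flush() {
        if (pending.isEmpty()) {
            return;
        }
        flushCount++; // one transaction for the whole accumulated batch
        pending.clear();
    }

    public int getFlushCount() {
        return flushCount;
    }
}
```

With this shape, the cursor loop would call add(handler.parse(apiResponse)) for each channel and flush() once after the loop, so the provider's beginTransaction()/endTransaction() pair runs a handful of times instead of once per channel.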