Webservices method saveItems can't save large files

We have an application that uses the web services API saveItems(req) to save content items. For some reason it fails if a binary field in PSItem is larger than 12MB. Is there any way to increase the size limit?

Thanks,
Weikai

There are a couple of places you’ll want to check for this. The web service methods have the same validators and pre/post processing extensions on them that the regular content editor has, so you want to check and change the max file size limit on the affected binary field in the Workbench if you are using the sys_File or image controls. Basically, if you can’t upload it manually with the editor, you won’t be able to through the web services API either.

If I remember it right, the other place to check is in the Rhythmyx/AppServer/server/rx/deploy/rxapp.ear/rxapp.war/WEB-INF/web.xml file. Look for the maxFileSize parameter.
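
If it helps, the entry in that web.xml looks something like the snippet below. I’m writing this from memory, so the exact enclosing element (init-param vs. context-param) may differ in your version; the part that matters is the maxFileSize name/value pair:

   <init-param>
      <param-name>maxFileSize</param-name>
      <param-value>100m</param-value>
   </init-param>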

-n

[QUOTE=weikai;20635]We have an application that uses the web services API saveItems(req) to save content items. For some reason it fails if a binary field in PSItem is larger than 12MB. Is there any way to increase the size limit?

Thanks,
Weikai[/QUOTE]

Nate,

Thank you for the suggestions. I was able to upload and save the file correctly in Content Explorer, so the problem is not on the server. Is Content Explorer using the same web services API calls?

For some reason, after I restarted the server, I was able to save the large file (12MB), but the file was corrupted and became 8MB. It looks like both the loadItems and saveItems methods in the web services (on the server side, since it’s using RMI) have problems handling large files. loadItems will eat all available heap and hit an out-of-memory error when loading a 116MB item. I have tried increasing the heap to 2GB but it did not help at all.

[QUOTE=natechadwick;20637]There are a couple of places you’ll want to check for this. The web service methods have the same validators and pre/post processing extensions on them that the regular content editor has, so you want to check and change the max file size limit on the affected binary field in the Workbench if you are using the sys_File or image controls. Basically, if you can’t upload it manually with the editor, you won’t be able to through the web services API either.

If I remember it right, the other place to check is in the Rhythmyx/AppServer/server/rx/deploy/rxapp.ear/rxapp.war/WEB-INF/web.xml file. Look for the maxFileSize parameter.

-n[/QUOTE]

Hi Weikai,

What version and patch level are you running? Also, is it 64-bit or 32-bit? We have had several patches and enhancements related to the handling of binaries - one bug in particular had to do with the file being incorrectly loaded into memory. 116MB is a large file, so that sounds similar to the behavior you are seeing.

This one in particular comes to mind:

RX-16383 - Out of memory when importing binary data.

That said, corruption of the 12MB file sounds strange to me. What’s going on in the server log during the saveItems call?

Not to state the obvious, but you really only want to include the binary in the loadItems call if you need the file. If you set IncludeBinary to false, it should not cause the binary to be loaded as part of the request.
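
For example, with the generated Axis client stubs the flag sits on the load request object; something like the sketch below (I’m recalling the class and setter names from the generated proxies, so double-check them against your own stubs):

   // Load the items without pulling the binary field contents across the wire.
   // LoadItemsRequest / setIncludeBinary are from the generated content service stubs, as I recall them.
   LoadItemsRequest loadReq = new LoadItemsRequest();
   loadReq.setId(new long[] { itemId });   // the content item id(s) you want
   loadReq.setIncludeBinary(false);        // skip the binary field values entirely
   PSItem[] items = binding.loadItems(loadReq);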

On the API question: they don’t use the same outer SOAP API, but internally they do use the same underlying Java code, and the same Content Editor extensions/rules applied to the fields are executed. Also, were you able to change the maxFileSize param in the web.xml?

Thanks,

-n

[QUOTE=weikai;20641]Nate,

Thank you for the suggestions. I was able to upload and save the file correctly in Content Explorer, so the problem is not on the server. Is Content Explorer using the same web services API calls?

For some reason, after I restarted the server, I was able to save the large file (12MB), but the file was corrupted and became 8MB. It looks like both the loadItems and saveItems methods in the web services (on the server side, since it’s using RMI) have problems handling large files. loadItems will eat all available heap and hit an out-of-memory error when loading a 116MB item. I have tried increasing the heap to 2GB but it did not help at all.[/QUOTE]

The application is copying content from one server to another. Both of the servers are on 32-bit Linux with MySQL. The default maxFileSize was 20m and I changed it to 100m (<param-name>maxFileSize</param-name><param-value>100m</param-value>) and restarted the server, but it did not seem to take effect: I’m still getting an AxisFault exception on the saveItems call if the size is bigger than 20MB. The 12MB file still behaves the same, the data is still corrupted and becomes 8MB. There were no errors in the server log when saving the items.

Also, PSFieldValue uses a String to store binary data. How is the data converted from a byte array to a String? I was trying to save the binary data to a file after reading it from the source server, so I could make sure it was good before calling saveItems on another instance, but I did not know what encoding was used for the conversion.


if (items != null) {
    PSItem item = items.get(0);

    if (!idMapping.isLocked()) {
        for (PSField field : item.getFields()) {

            String type = field.getDataType().getValue();
            if (type.equals("binary")) {
                // Dump the binary field to a file so it can be checked before saveItems.
                File file = new File("log/test.png");
                try {
                    file.createNewFile();
                    OutputStream output = new BufferedOutputStream(
                            new FileOutputStream(file.getAbsoluteFile()));
                    // rawData is a String, so this depends on whatever encoding was used.
                    output.write(field.getPSFieldValue()[0].getRawData().getBytes());
                    output.close();
                } catch (Exception e) {
                    log.error(e.getMessage());
                }
            }
        }
    }
}

Web services generated code (PSFieldValue):


public class PSFieldValue  implements java.io.Serializable {
    /* The string or encoded data if the mime type is not set to XML. */
    private java.lang.String rawData;

    private java.lang.String attachmentId;  // attribute

    public PSFieldValue() {
    }

    public PSFieldValue(
           java.lang.String rawData,
           java.lang.String attachmentId) {
           this.rawData = rawData;
           this.attachmentId = attachmentId;
    }

    // ... generated getters and setters omitted ...
}
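
To illustrate why the String conversion worries me, here is a small standalone test of my own (nothing Rhythmyx-specific, and the charset is just a guess) showing that arbitrary bytes generally do not survive a round trip through String:

import java.util.Arrays;

public class RoundTripTest {
    public static void main(String[] args) throws Exception {
        // Arbitrary binary data, e.g. the first bytes of a PNG plus some non-text bytes.
        byte[] original = { (byte) 0x89, 0x50, 0x4E, 0x47, (byte) 0xFF, (byte) 0xFE, 0x00, 0x10 };

        // Convert to String and back, the way rawData appears to be handled.
        String asString = new String(original, "UTF-8");
        byte[] roundTripped = asString.getBytes("UTF-8");

        // Invalid byte sequences get replaced during decoding, so the data no longer matches.
        System.out.println(Arrays.equals(original, roundTripped)); // prints false
    }
}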

[QUOTE=natechadwick;20642]Hi Weikai,

What version and patch level are you running? Also, is it 64-bit or 32-bit? We have had several patches and enhancements related to the handling of binaries - one bug in particular had to do with the file being incorrectly loaded into memory. 116MB is a large file, so that sounds similar to the behavior you are seeing.

This one in particular comes to mind:

RX-16383 - Out of memory when importing binary data.

That said, corruption of the 12MB file sounds strange to me. What’s going on in the server log during the saveItems call?

Not to state the obvious, but you really only want to include the binary in the loadItems call if you need the file. If you set IncludeBinary to false, it should not cause the binary to be loaded as part of the request.

On the API question: they don’t use the same outer SOAP API, but internally they do use the same underlying Java code, and the same Content Editor extensions/rules applied to the fields are executed. Also, were you able to change the maxFileSize param in the web.xml?

Thanks,

-n[/QUOTE]

Can you paste in the fault or error message that you are getting? Is this 6.7 / 7.0.3 / 7.1 / 7.2?

I also wonder if the limit is on the MySQL side; check out this maximum-sizes-of-mysql-blob-fields post. Maybe confirm the data type of the backing table column for that content type against those sizes (a MEDIUMBLOB, for example, tops out around 16MB, while a LONGBLOB goes up to 4GB) and check the MySQL logs for errors?

This Java method worked for me for getting binary content out in a recent script. If the images are that large, though, you probably want to iterate through the reader with a buffer and write to a file output stream instead of loading the whole thing into a byte array.


   byte[] getBinaryData(ContentSOAPStub binding, PSItem item, String fieldName)
      throws Exception
   {
      // Gets the attachment id from the value of the named field
      String attachmentId = "";
      for (PSField field : item.getFields())
      {
         if (field.getName().equals(fieldName))
         {
            PSFieldValue[] values = field.getPSFieldValue();
            attachmentId = values[0].getAttachmentId();
            break;
         }
      }

      // Retrieves the attachment with the attachment id
      InputStream reader = null;
      Object[] attachments = binding.getAttachments();
      for (Object attachment : attachments)
      {
         AttachmentPart part = (AttachmentPart) attachment;
         if (part.getContentId().equals(attachmentId))
         {
            reader = (InputStream) part.getContent();
            break;
         }
      }

      if (reader == null)
         return null;

      // Read the stream fully; a single read() sized by available() is not reliable.
      ByteArrayOutputStream content = new ByteArrayOutputStream();
      byte[] buffer = new byte[8192];
      int read;
      while ((read = reader.read(buffer)) != -1)
         content.write(buffer, 0, read);
      reader.close();
      return content.toByteArray();
   }

Maybe modify that last block to use the technique in the link below, or some variation, just so that you don’t load the whole file into memory on the client side.

http://www.coderanch.com/t/278406//java/Writing-input-stream-file
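
Something along these lines (my own sketch, nothing Rhythmyx-specific, just plain java.io) streams the attachment straight to disk in fixed-size chunks:

   // Copy the attachment stream to a file without holding the whole thing in memory.
   void writeAttachmentToFile(InputStream reader, File target) throws IOException
   {
      OutputStream out = new BufferedOutputStream(new FileOutputStream(target));
      try
      {
         byte[] buffer = new byte[8192];
         int read;
         while ((read = reader.read(buffer)) != -1)
            out.write(buffer, 0, read);
      }
      finally
      {
         out.close();
         reader.close();
      }
   }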

Here was the code used when adding the attachment:

   private String addAttachment(ContentSOAPStub binding, File attachment) throws MalformedURLException
   {
      // Wrap the file in a DataHandler so Axis can stream it as a SOAP attachment.
      URL url = attachment.toURI().toURL();
      DataHandler handler = new DataHandler(new URLDataSource(url));
      AttachmentPart part = new AttachmentPart(handler);
      part.setContentLocation(attachment.getAbsolutePath());
      binding.addAttachment(part);
      return part.getContentId();
   }
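
For completeness, this is roughly how the two pieces fit together on the save side. Treat it as a sketch - the request and setter names (SaveItemsRequest, setPSFieldValue, setAttachmentId) are how I remember the generated stubs, so verify them against yours:

   // Sketch: attach the file, point the binary field's value at the attachment id,
   // then save the item through the same binding.
   void saveItemWithBinary(ContentSOAPStub binding, PSItem item, String fieldName, File file)
      throws Exception
   {
      String attachmentId = addAttachment(binding, file);

      for (PSField field : item.getFields())
      {
         if (field.getName().equals(fieldName))
         {
            PSFieldValue value = new PSFieldValue();
            value.setAttachmentId(attachmentId);   // reference the attachment instead of rawData
            field.setPSFieldValue(new PSFieldValue[] { value });
            break;
         }
      }

      SaveItemsRequest saveReq = new SaveItemsRequest();
      saveReq.setPSItem(new PSItem[] { item });
      binding.saveItems(saveReq);
   }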


Hope this helps.

-n

Thank you Nate! I think I got it. The binary fields must be handled as attachments; otherwise the call either throws an exception if the file is bigger than the max file size, or the data ends up corrupted at 8MB.

[QUOTE=natechadwick;20644]Can you paste in the fault or error message that you are getting? Is this 6.7 / 7.0.3 / 7.1 / 7.2?

I also wonder if the limit is on the MySQL side; check out this maximum-sizes-of-mysql-blob-fields post. Maybe confirm the data type of the backing table column for that content type against those sizes and check the MySQL logs for errors?

[/QUOTE]