
Upload a file in code to another page's page folder


How do I go about uploading a file to another page's page folder in code? I can't figure out how to do this. The pages are created in the same piece of code, and each page being created has a set of pictures belonging to it, so I thought the page folder would be a great place to store them.

Can anybody help me out with this one?

#49559
Mar 23, 2011 13:42
This code accesses another page's page files:
if (UploadFiles1.Files.Count > 0)
{
    page = EPiServer.DataFactory.Instance.GetPage(pageRef) as FaqItemPageType;

    // Get (or create) the target page's page folder and bypass access checks
    var folder = GetPageDirectory(page, true);
    folder.BypassAccessCheck = true;
    folder = folder.CreateSubdirectory("from_user");
    folder.BypassAccessCheck = true;

    foreach (string fileName in UploadFiles1.Files)
    {
        UnifiedFile fileInTemp = HostingEnvironment.VirtualPathProvider.GetFile(fileName) as UnifiedFile;
        fileInTemp.BypassAccessCheck = true;
        try
        {
            fileInTemp.MoveTo(folder.VirtualPath + fileInTemp.Name);
        }
        catch { /* ignore files that could not be moved */ }
    }

    folder.BypassAccessCheck = false;
}
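
(GetPageDirectory is a local helper that isn't shown in the snippet above. Assuming it just wraps the built-in page.GetPageDirectory(createIfNotExist) extension that is used later in this thread, it could look something like this:)

// Assumed helper, not part of the original snippet: simply forwards to the
// page.GetPageDirectory(createIfNotExist) call used elsewhere in this thread.
private UnifiedDirectory GetPageDirectory(PageData page, bool createIfNotExist)
{
    return page.GetPageDirectory(createIfNotExist);
}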

    

#49562
Edited, Mar 23, 2011 13:55

I found working code but I get a strange error message. It says "This stream does not support seek operations.". After I create the page I use the PageReference of the new page to find its page folder. The file is created but its size is 0 bytes. Any idea what's wrong with the stream?

Here's the code I use. It crashes on the line byte[] buffer = new byte[fileContent.Length];.

PageReference newPageRef = DataFactory.Instance.Save(newPage, SaveAction.Publish, AccessLevel.NoAccess);
               
Stream objImgStream = new WebClient().OpenRead("http://www.somedomain.com/somefile.jpg");
UploadFile(newPageRef, objImgStream, "logotype.jpg");

protected void UploadFile(PageReference pageRef, Stream fileContent, string fileName)
{
    PageData page = DataFactory.Instance.GetPage(pageRef);
    UnifiedDirectory dir = page.GetPageDirectory(true);
    UnifiedFile uFile = dir.CreateFile(fileName);
    Stream s = uFile.Open(FileMode.CreateNew);

    byte[] buffer = new byte[fileContent.Length]; // <-- crashes here: the stream returned by WebClient.OpenRead() does not support Length
    int numBytesToRead = (int)fileContent.Length;

    fileContent.Read(buffer, 0, numBytesToRead);

    s.Write(buffer, 0, buffer.Length);
    s.Close();

    fileContent.Close();
}

#49566
Mar 23, 2011 14:34

Hi Peter!

The Stream implementation returned by WebClient.OpenRead() is of type ConnectStream, and a quick look into its implementation reveals this code:

public override long Length
{
    get
    {
        throw new NotSupportedException(SR.GetString("net_noseek"));
    }
}
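
(A quick way to detect this up front, as a side note that isn't in the original reply, is to check Stream.CanSeek before touching Length:)

// Illustration only: the stream returned by WebClient.OpenRead() reports
// CanSeek == false, so reading Length or Position throws NotSupportedException.
Stream response = new WebClient().OpenRead("http://www.episerver.com");
if (!response.CanSeek)
{
    // Don't read response.Length here; use the Content-Length header
    // or copy the stream in chunks instead.
}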
 

What you could do instead is to retrieve the Content-Length from the response headers and pass it as a parameter to the UploadFile method, something like this:

WebClient wc = new WebClient();
Stream sw = wc.OpenRead("http://www.episerver.com");
long contentLength = long.Parse(wc.ResponseHeaders["Content-Length"]);
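
Then (assuming UploadFile is extended with an extra contentLength parameter, which is not part of the original method) the call would be along the lines of:

// Hypothetical call, assuming the extra contentLength parameter exists:
UploadFile(newPageRef, sw, "logotype.jpg", contentLength);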

BUT, you should make sure that you don't download files that are too big, as the buffer required could quite easily throw an OutOfMemoryException!

A safer approach would be to download the file in reasonably sized chunks and write them to the corresponding page folder file stream!

/johan

    

#49574
Edited, Mar 23, 2011 16:30

The files are small pictures and they will be read one at a time, so running out of memory will hopefully not be a problem. It's also a one-time job, so once executed the code will not be needed again.

So what do I do with the long? Pass it on to the function and replace these two lines?

byte[] buffer = new byte[fileContent.Length];
int numBytesToRead = (int)fileContent.Length;

with

byte[] buffer = new byte[contentLength];
int numBytesToRead = (int)contentLength;
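
(In other words, assuming the method signature gains an extra parameter, the sketch would be:)

// Assumed new signature; the rest of the method is unchanged from above.
protected void UploadFile(PageReference pageRef, Stream fileContent, string fileName, long contentLength)
{
    byte[] buffer = new byte[contentLength];
    int numBytesToRead = (int)contentLength;
    // ... same as before ...
}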

#49575
Mar 23, 2011 16:43

Yes.

/johan

#49576
Mar 23, 2011 16:46

That code works but the images written to the page folder are empty... :/

#49577
Mar 23, 2011 16:49

Actually, the code to "chunk" the transfer would take just a few more lines, and then you'd be safe regardless of file size (at least as long as there's enough disk space to store the files). A single Read() on a network stream is also not guaranteed to return everything you ask for in one call, so reading in a loop is the reliable way to do it:

const int CHUNK_SIZE = 8192;

byte[] buffer = new byte[CHUNK_SIZE]; 
            
int bytesRead;
while(0 < (bytesRead = fileContent.Read(buffer, 0, CHUNK_SIZE)))
{
    s.Write(buffer, 0, bytesRead);
}
/johan
#49578
Mar 23, 2011 16:54

I don't really know which lines of code to replace. Would you mind pasting it in its context?

#49579
Mar 23, 2011 16:57

Sure, this would be the new UploadFile() method:

protected void UploadFile(PageReference pageRef, Stream fileContent, string fileName)
{
    PageData page = DataFactory.Instance.GetPage(pageRef);
    UnifiedDirectory dir = page.GetPageDirectory(true);
    UnifiedFile uFile = dir.CreateFile(fileName);
    Stream s = uFile.Open(FileMode.CreateNew);

    const int CHUNK_SIZE = 8192;

    byte[] buffer = new byte[CHUNK_SIZE];
    int bytesRead;

    // Copy the source stream in chunks until Read() returns 0 (end of stream)
    while (0 < (bytesRead = fileContent.Read(buffer, 0, CHUNK_SIZE)))
    {
        s.Write(buffer, 0, bytesRead);
    }

    s.Close();

    fileContent.Close();
}
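
For completeness, the call site from earlier in the thread stays exactly the same; a small sketch (the using block is an extra safety touch that wasn't in the original post):

// Usage, reusing the earlier example; the URL is just the original placeholder.
PageReference newPageRef = DataFactory.Instance.Save(newPage, SaveAction.Publish, AccessLevel.NoAccess);

using (Stream objImgStream = new WebClient().OpenRead("http://www.somedomain.com/somefile.jpg"))
{
    UploadFile(newPageRef, objImgStream, "logotype.jpg");
}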

/johan

#49580
Edited, Mar 23, 2011 17:00

Finally!! :) Took all afternoon but now the pictures made it "over" ok. Thanks a million!

#49581
Mar 23, 2011 17:08