					
				Problem with URLDownloadToFile_ API Call?
				Posted: Tue Jul 12, 2005 2:27 pm
				by ebs
				I just tried a program I wrote in a previous version of PB under v3.93. This API call no longer works (it used to work fine):
Code:
URLDownloadToFile_(0, "http://xoap.weather.com/weather/local/USNY0181?cc=*&dayf=1", "weather.xml", 0, 0)
It returns a value of -2147221020, which is $800401E4. It is supposed to return a value of 0 if successful.
The URL is OK, because I tried it in my browser and it returns the XML document as expected.
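For anyone who wants to try it, here is a minimal, self-contained test (just a sketch; it assumes PB's Hex(), which may render a negative long differently between PB versions) that prints the return value in hex so it can be looked up:
Code:
; Run URLDownloadToFile_ and show the raw return value both as a signed
; long and in hex (0 = S_OK, anything else is an HRESULT error code).
result = URLDownloadToFile_(0, "http://xoap.weather.com/weather/local/USNY0181?cc=*&dayf=1", "weather.xml", 0, 0)
If result = 0
  Debug "Download OK"
Else
  Debug "Failed, return value = " + Str(result) + " ($" + Hex(result) + ")"
EndIf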
Can anyone reproduce this?
Thanks,
Eric
 
			
					
				
				Posted: Tue Jul 12, 2005 5:55 pm
				by El_Choni
				Doesn't work with that URL and 3.93 either. Works with other URLs.
			 
			
					
				
				Posted: Tue Jul 12, 2005 7:06 pm
				by MrMat
It works OK here (XP SP2), but it looks like urlmon has some problems with gzip on some systems; e.g. see 
->here<-.
 
			
					
				
				Posted: Tue Jul 12, 2005 9:30 pm
				by ebs
				Thanks for the replies!
The file I'm trying to download is just a text (XML) file, so I don't think the gzip bug applies in this situation.
To add to the confusion, I just tried the same thing (same API call, same URL) in Visual Basic 6 and it works fine!
Does anyone have other suggestions?
Regards,
Eric
			 
			
					
				
				Posted: Tue Jul 12, 2005 10:24 pm
				by dagcrack
Try calling the function yourself: open the DLL and call it directly from there. If that works, it might be a bug in the PB WinAPI library? Check MSDN!
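Something like this, as a rough sketch (it assumes the ANSI entry point URLDownloadToFileA in urlmon.dll and that PB's CallFunction() accepts string parameters):
Code:
; Load urlmon.dll by hand and call the export directly, bypassing the
; declaration in PB's WinAPI library, to see whether the result differs.
If OpenLibrary(0, "urlmon.dll")
  result = CallFunction(0, "URLDownloadToFileA", 0, "http://xoap.weather.com/weather/local/USNY0181?cc=*&dayf=1", "weather.xml", 0, 0)
  Debug result ; 0 = S_OK, same HRESULT convention as before
  CloseLibrary(0)
Else
  Debug "Could not open urlmon.dll"
EndIf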
			 
			
					
				
				Posted: Wed Jul 13, 2005 1:01 am
				by El_Choni
Here's a workaround (it sends a HEAD request first to get the content length for the buffer, then reads the body with InternetReadFile_() and writes it to the local file):
Code:
Procedure CheckError(value, message$, terminate)
  If value=0:MessageRequester("Error", message$):If terminate:End:EndIf:EndIf
EndProcedure
#INTERNET_FLAG_RELOAD = $80000000
#INTERNET_DEFAULT_HTTP_PORT = 80
#INTERNET_SERVICE_HTTP = 3
#HTTP_QUERY_FLAG_NUMBER = $20000000
#HTTP_QUERY_CONTENT_LENGTH = 5
#HTTP_QUERY_STATUS_CODE = 19
#HTTP_STATUS_OK = 200
#INTERNET_OPEN_TYPE_DIRECT = 1
Procedure Download(URL$, LocalFile$)
  Domain$ = RemoveString(Left(URL$, FindString(URL$, "/", 8)-1), "http://"):dwordSize = 4
  hInet = InternetOpen_("Mozilla/5.0 (Windows; U; Windows NT 5.1; es-ES; rv:1.7.8) Gecko/20050511 Firefox/1.0.4", #INTERNET_OPEN_TYPE_DIRECT, #NULL, #NULL, 0):CheckError(hInet, "Internet connection not available.", #TRUE)
  hURL = InternetOpenUrl_(hInet, URL$, #NULL, 0, #INTERNET_FLAG_RELOAD, 0):CheckError(hURL, "InternetOpenUrl_() failed", #TRUE)
  hInetCon = InternetConnect_(hInet, Domain$, #INTERNET_DEFAULT_HTTP_PORT, #NULL, #NULL, #INTERNET_SERVICE_HTTP, 0, 0):CheckError(hInetCon, "Unable to connect to "+Domain$, #TRUE)
  hHttpOpenRequest = HttpOpenRequest_(hInetCon, "HEAD", RemoveString(URL$, "http://"+Domain$+"/"), "http/1.0", #NULL, 0, #INTERNET_FLAG_RELOAD, 0):CheckError(hHttpOpenRequest, "Http open request to "+Domain$+" failed", #TRUE)
  CheckError(HttpSendRequest_(hHttpOpenRequest, #NULL, 0, 0, 0), "Http send request to "+Domain$+" failed.", #TRUE)
  CheckError(HttpQueryInfo_(hHttpOpenRequest, #HTTP_QUERY_FLAG_NUMBER|#HTTP_QUERY_STATUS_CODE, @sCode, @dwordSize, @lpdwIndex), "Http query failed.", #FALSE)
  CheckError(sCode=#HTTP_STATUS_OK, "Status code query failed.", #FALSE)
  CheckError(HttpQueryInfo_(hHttpOpenRequest, #HTTP_QUERY_FLAG_NUMBER|#HTTP_QUERY_CONTENT_LENGTH, @sCode, @dwordSize, @lpdwIndex), "CONTENT_LENGTH query failed.", #FALSE)
  If sCode:DataBufferLength = sCode:Else:DataBufferLength = 4096:EndIf
  *DataBuffer = AllocateMemory(DataBufferLength):CheckError(*DataBuffer, "Not enough memory.", #TRUE)
  CheckError(CreateFile(0, LocalFile$), "Unable to create file.", #TRUE)
  Repeat
    CheckError(InternetReadFile_(hURL, *DataBuffer, DataBufferLength, @Bytes), "Download failed.", #TRUE)
    If Bytes:WriteData(*DataBuffer, Bytes):EndIf
  Until Bytes=0
  CloseFile(0):FreeMemory(*DataBuffer):InternetCloseHandle_(hInetCon):InternetCloseHandle_(hURL):InternetCloseHandle_(hInet)
EndProcedure
Download("http://xoap.weather.com/weather/local/USNY0181?cc=*&dayf=1", "weather.xml")
 
			
					
				
				Posted: Wed Jul 13, 2005 1:48 pm
				by ebs
				El Choni,
Thank you for the code - it works perfectly. I've actually used it before, when I made a podcast downloader.
I used your code that included the progress indicator, so I guess that I owe you THANKS X 2! 
 
 
I'm still puzzled as to why the URLDownloadToFile_ call stopped working.
Regards,
Eric
 
			
					
				
				Posted: Wed Jul 13, 2005 9:33 pm
				by PB
				> I'm still puzzled as to why the URLDownloadToFile_ call stopped working.
It hasn't.  It's working fine as you'll see if you test with a different URL:
Code:
Debug URLDownloadToFile_(0, "http://www.purebasic.com/images/logopb.gif", "c:\logopb.gif", 0, 0)
Obviously it's just having problems finding your weather file, that's all.  

Perhaps it only works correctly with direct paths to files, instead of dynamically generated ones?
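One way to check that (just a sketch reusing the two URLs from this thread) is to run a static file and a dynamically generated page through the same call and compare the return values:
Code:
; If only the second call fails, the problem is specific to dynamically
; generated URLs rather than to URLDownloadToFile_ itself.
Debug URLDownloadToFile_(0, "http://www.purebasic.com/images/logopb.gif", "c:\logopb.gif", 0, 0)
Debug URLDownloadToFile_(0, "http://xoap.weather.com/weather/local/USNY0181?cc=*&dayf=1", "c:\weather.xml", 0, 0)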
 
			
					
				
				Posted: Wed Jul 13, 2005 10:18 pm
				by fweil
I guess that's the truth ... dynamic pages won't download with URLDownloadToFile_ ... but they do with the lower-level API.
Code:
;
; From El_Choni
; http://forums.purebasic.com/english/viewtopic.php?t=15891
;
Enumeration
  #File
EndEnumeration
#INTERNET_FLAG_RELOAD = $80000000
#INTERNET_DEFAULT_HTTP_PORT = 80
#INTERNET_SERVICE_HTTP = 3
#HTTP_QUERY_FLAG_NUMBER = $20000000
#HTTP_QUERY_CONTENT_LENGTH = 5
#HTTP_QUERY_STATUS_CODE = 19
#HTTP_STATUS_OK = 200
#INTERNET_OPEN_TYPE_DIRECT = 1
Procedure CheckError(value, sMessage.s, terminate)
  If value = 0
      Debug "Error : " + sMessage
      If terminate
          End
      EndIf
  EndIf
EndProcedure
Procedure Internet_Download_to_File(URL.s, FileName.s)
  If URLDownloadToFile_(#NULL, URL, FileName, #NULL, #NULL) <> 0
      Debug "Using low level API code"
      Domain.s = RemoveString(Left(URL, FindString(URL, "/", 8) - 1), "http://")
      dwordSize = 4
      hInet = InternetOpen_("Mozilla/5.0 (Windows; U; Windows NT 5.1; es-ES; rv:1.7.8) Gecko/20050511 Firefox/1.0.4", #INTERNET_OPEN_TYPE_DIRECT, #NULL, #NULL, 0)
      CheckError(hInet, "Internet connection not available.", #TRUE)
      hURL = InternetOpenUrl_(hInet, URL, #NULL, 0, #INTERNET_FLAG_RELOAD, 0)
      CheckError(hURL, "InternetOpenUrl_() failed", #TRUE)
      hInetCon = InternetConnect_(hInet, Domain, #INTERNET_DEFAULT_HTTP_PORT, #NULL, #NULL, #INTERNET_SERVICE_HTTP, 0, 0)
      CheckError(hInetCon, "Unable to connect to " + Domain, #TRUE)
      hHttpOpenRequest = HttpOpenRequest_(hInetCon, "HEAD", RemoveString(URL, "http://" + Domain + "/"), "http/1.0", #NULL, 0, #INTERNET_FLAG_RELOAD, 0)
      CheckError(hHttpOpenRequest, "Http open request to " + Domain + " failed", #TRUE)
      CheckError(HttpSendRequest_(hHttpOpenRequest, #NULL, 0, 0, 0), "Http send request to " + Domain + " failed.", #TRUE)
      CheckError(HttpQueryInfo_(hHttpOpenRequest, #HTTP_QUERY_FLAG_NUMBER | #HTTP_QUERY_STATUS_CODE, @sCode, @dwordSize, @lpdwIndex), "Http query failed.", #FALSE)
      CheckError(sCode = #HTTP_STATUS_OK, "Status code query failed.", #FALSE)
      CheckError(HttpQueryInfo_(hHttpOpenRequest, #HTTP_QUERY_FLAG_NUMBER | #HTTP_QUERY_CONTENT_LENGTH, @sCode, @dwordSize, @lpdwIndex), "CONTENT_LENGTH query failed.", #FALSE)
      If sCode
          DataBufferLength = sCode
        Else
          DataBufferLength = 4096
      EndIf
      *DataBuffer = AllocateMemory(DataBufferLength)
      CheckError(*DataBuffer, "Not enough memory.", #TRUE)
      CheckError(CreateFile(0, FileName), "Unable to create file.", #TRUE)
      Repeat
        CheckError(InternetReadFile_(hURL, *DataBuffer, DataBufferLength, @Bytes), "Download failed.", #TRUE)
        If Bytes
            WriteData(*DataBuffer, Bytes)
        EndIf
      Until Bytes=0
      CloseFile(0)
      FreeMemory(*DataBuffer)
      InternetCloseHandle_(hInetCon)
      InternetCloseHandle_(hURL)
      InternetCloseHandle_(hInet)
    Else
      Debug "Using URLDownloadToFile_() API code"
  EndIf
EndProcedure
;
;
;
;  URL.s = "http://xoap.weather.com/weather/local/USNY0181?cc=*&dayf=1"
;  URL.s = "http://forums.purebasic.com/english/viewtopic.php?t=15891"
;  URL.s = "http://www.paroles.net/"
;  URL.s = "http://www.voila.fr/PagesJaunes/"
  URL.s = "http://www.societe.com/cgi-bin/liste?nom=cl+marketing&dirig=&pre=&ape=&dep=&image2.x=0&image2.y=0"
  FileName.s = "CacheFile.txt"
  Internet_Download_to_File(URL, FileName)
  If ReadFile(#File, FileName)
      Repeat
        a$ = ReadString()
        Debug a$
      Until Eof(#File)
      CloseFile(#File)
  EndIf
  DeleteFile(FileName)
End
If you try the commented-out URLs in the URL.s list, you will get the corresponding results.
Anyway, I find this interesting enough to look into further.
Regards
 
			
					
				
				Posted: Wed Jul 13, 2005 10:50 pm
				by ebs
				PB wrote:
Obviously it's just having problems finding your weather file, that's all.  
 
 
I think that there's more to it than that:
1. The same code used to work when compiled with an earlier version of PB.
2. Visual Basic code using the same API call and URL works fine.
I will check tomorrow to see if I still have the compiled EXE file from the earlier version, and if that still works.
Eric
 
			
					
				
				Posted: Fri Feb 17, 2006 7:45 pm
				by mike74
				ebs wrote:
I think that there's more to it than that:
1. The same code used to work when compiled with an earlier version of PB.
2. Visual Basic code using the same API call and URL works fine.
I will check tomorrow to see if I still have the compiled EXE file from the earlier version, and if that still works.
Eric
Eric,
Did you ever learn anything else about this?  I would like to use urlmon in 3.94 and 4.0 and it would be nice to have a heads up if there are known problems.
Thanks,
Mike
edit: realized I wrote 3.93 where I meant 3.94
 
			
					
				
				Posted: Fri Feb 17, 2006 9:00 pm
				by ebs
				Mike,
I really didn't pursue it much further. fweil's explanation regarding problems with dynamic web pages and URLDownloadToFile might be the answer.
I used a slightly modified version of El Choni's low-level API code instead.
It still puzzles me, though, as to why it worked in an earlier version of PB.
Regards,
Eric