Login to HTTPS site, use cookie to get more pages
Posted: Wed Oct 21, 2015 2:23 pm
by Keya
Hello,
Sorry if this has already been asked and answered, but I didn't have much luck with my searches (woeful search Wednesday), and most demos I found were quite old and don't work with current PB. I'm also not familiar with requesting HTTPS in PB, but if I'm not mistaken recent builds added TLS/SSL support, so there might be newer/easier ways to do this than the old demos showed anyway? But the word HTTPS only appears twice in the help file, and TLS only seems to apply to SendMail, so I'm not sure.
Basically all I want to do is send a request to log in to an HTTPS website with username + password, and then I'm guessing I need to store the cookie it sends back in the header, which I can then use when I make further requests for pages now that I'm logged in? Hope I've got that right anyway!
I only really need to do this on Windows as it's just a personal project to make my shopping easier lol, but if PB can do it cross-platform that's a great bonus.

Re: Login to HTTPS site, use cookie to get more pages
Posted: Wed Oct 21, 2015 4:24 pm
by deseven
Yes, the latest PB (5.40) has libcurl included, so you can easily get HTTPS headers like this:
Code:
InitNetwork()
Debug GetHTTPHeader("https://google.com")
Then just grab the Set-Cookie headers and you're good to go.
However, if you need to use those cookies later in your requests, that requires a bit more work. Grab this and check the example.
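To illustrate the first part, here's a rough, untested sketch of scanning the header block for Set-Cookie lines; it assumes GetHTTPHeader() returns the headers CRLF-separated:

```purebasic
; Untested sketch: pull Set-Cookie lines out of the raw header block.
; Assumes the headers come back CRLF-separated.
InitNetwork()
Header.s = GetHTTPHeader("https://google.com")
For i = 1 To CountString(Header, #CRLF$) + 1
  Line.s = StringField(Header, i, #CRLF$)
  If LCase(Left(Line, 11)) = "set-cookie:"
    Debug Trim(Mid(Line, 12)) ; the cookie value to send back on later requests
  EndIf
Next
```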
Re: Login to HTTPS site, use cookie to get more pages
Posted: Wed Oct 21, 2015 4:55 pm
by Keya
LIBCURL, that's the magic search term I was after! Quite a few more searches to look at now heehee, including yours, deseven (thank you!), which I notice includes "curl_easy_setopt(curl,#CURLOPT_COOKIE,@cookie)"
Fred, THANK YOU for adding secure web + email to PB!!!
OK, now I know the entrance to the maze, here I go!
Re: Login to HTTPS site, use cookie to get more pages
Posted: Wed Oct 21, 2015 5:05 pm
by deseven
Just two more things you may need to know:
1. This is not thread-safe. Use mutexes or semaphores if you need to grab different data in different threads.
2. If you disable unicode mode, you should avoid using the str2curl() procedure.
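For point 1, a minimal sketch of what I mean; the procedure name and the request body are illustrative, not from the example:

```purebasic
; Minimal sketch: serialize access to the shared curl handle with a mutex.
; SafeRequest() is a made-up name; wrap whichever request procedure you use.
Global CurlMutex = CreateMutex()

Procedure.s SafeRequest(url.s)
  Protected result.s
  LockMutex(CurlMutex)   ; only one thread at a time may touch the handle
  ; ... perform the curl request here and collect the data into result ...
  UnlockMutex(CurlMutex)
  ProcedureReturn result
EndProcedure
```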
Re: Login to HTTPS site, use cookie to get more pages
Posted: Wed Oct 21, 2015 7:35 pm
by Keya
Getting there! ... This uses COOKIEFILE (for persistence across requests, if I'm understanding correctly), and also an enumeration of all the cookies sent by the server via COOKIELIST, which isn't really required for this but is nice for debugging.

(based on http://curl.haxx.se/libcurl/c/cookie_interface.html)
Code:
;Before request...
cookie_file.s = "cookie.txt"
curl_easy_setopt(curl,#CURLOPT_COOKIEFILE,@cookie_file)
curl_easy_setopt(curl,#CURLOPT_COOKIEJAR,@cookie_file)

;After request...
*cookies.Curl_Slist
*nextcookie.Curl_Slist
If curl_easy_getinfo(curl,#CURLINFO_COOKIELIST,@*cookies) = #CURLE_OK
  *nextcookie = *cookies
  While *nextcookie
    Debug "Cookie=" + PeekS(*nextcookie\Data, -1, #PB_Ascii) ;for both Unicode and Ascii compiles
    *nextcookie = *nextcookie\next_
  Wend
EndIf

;Then before our next request we can set them... (but perhaps this isn't necessary if curl handles that persistence itself, I'm not sure)
If curl_easy_setopt(curl, #CURLOPT_COOKIELIST, *cookies) = #CURLE_OK
EndIf

;Cleanup - free the list only once; *nextcookie just walks the same list
curl_slist_free_all(*cookies)
And a simple mod to str2curl to make it Unicode + Ascii friendly; probably not very efficient, but easier than reworking the existing code!:
Code:
Procedure.s str2curl(string.s)
  ; libcurl expects 8-bit (ASCII) strings, so in a Unicode build we repack
  ; the UTF-16 string's bytes as ASCII before handing it to curl
  CompilerIf #PB_Compiler_Unicode
    Protected result.s = Space(Len(string)) ;buffer big enough to hold the ASCII bytes
    PokeS(@result, string, -1, #PB_Ascii)
    ProcedureReturn result
  CompilerElse
    ProcedureReturn string ;Ascii build: already in the right format
  CompilerEndIf
EndProcedure
Re: Login to HTTPS site, use cookie to get more pages
Posted: Thu Oct 22, 2015 6:40 am
by Keya
Success!
deseven, I'm just using your libcurl.pbi as-is, and your libcurl example was a very solid base for me to tweak for this purpose, thank you very much!
My specific needs were then made very easy by libcurl, simply by using CURLOPT_COOKIEFILE (/COOKIEJAR), which it maintains internal persistence with unless you tell it otherwise, so there's really not much work on my part heehee
Code:
EnableExplicit
Global hcurl.i

InitNetwork()
IncludeFile "libcurl.pbi" ;https://github.com/deseven/pbsamples/blob/master/crossplatform/libcurl/libcurl.pbi

Procedure.i _CURL_INIT()
  Protected cookie_file.s = "cookie.txt"
  hcurl = curl_easy_init()
  If hcurl
    Protected agent.s = str2curl("Mozilla/5.0")
    curl_easy_setopt(hcurl,#CURLOPT_IPRESOLVE,#CURL_IPRESOLVE_V4)
    curl_easy_setopt(hcurl,#CURLOPT_COOKIEJAR,@cookie_file)
    curl_easy_setopt(hcurl,#CURLOPT_COOKIEFILE,@cookie_file)
    curl_easy_setopt(hcurl,#CURLOPT_USERAGENT,@agent)
    curl_easy_setopt(hcurl,#CURLOPT_TIMEOUT,40)
    curl_easy_setopt(hcurl,#CURLOPT_FOLLOWLOCATION,1)
    curl_easy_setopt(hcurl,#CURLOPT_MAXREDIRS,10)
    curl_easy_setopt(hcurl,#CURLOPT_WRITEFUNCTION,@curlWriteData())
    Protected *header, *Headers, header.s = str2curl("Cache-Control: no-cache")
    *header = curl_slist_append(*Headers,header)
    curl_easy_setopt(hcurl,#CURLOPT_HTTPHEADER,*header)
  EndIf
  ProcedureReturn hcurl
EndProcedure

Procedure _CURL_CLOSE(hcurl.i)
  curl_easy_cleanup(hcurl)
EndProcedure

Procedure.i LoginSite()
  Protected url.s = str2curl("https://www.site.com/login.php")
  Protected post.s = str2curl("email=my@email.com&password=s3cr3t")
  Protected resHTTP.i
  curl_easy_setopt(hcurl,#CURLOPT_POST,1) ;'POST' request
  curl_easy_setopt(hcurl,#CURLOPT_URL,@url)
  curl_easy_setopt(hcurl,#CURLOPT_POSTFIELDS,@post)
  Protected res.i = curl_easy_perform(hcurl)
  Protected resData.s = curlGetData()
  curl_easy_getinfo(hcurl,#CURLINFO_RESPONSE_CODE,@resHTTP)
  Debug "result: " + Str(res)
  Debug "HTTP code: " + Str(resHTTP)
  Debug "HTTP data: " + #CRLF$ + resData
EndProcedure

Procedure.i SearchSite(sFind.s)
  Protected url.s = str2curl("https://www.site.com/search.php?SearchTerm=" + sFind)
  Protected resHTTP.i
  curl_easy_setopt(hcurl,#CURLOPT_POST,0) ;'GET' request
  curl_easy_setopt(hcurl,#CURLOPT_URL,@url)
  Protected res = curl_easy_perform(hcurl)
  Protected resData.s = curlGetData()
  curl_easy_getinfo(hcurl,#CURLINFO_RESPONSE_CODE,@resHTTP)
  Debug "result: " + Str(res)
  Debug "HTTP code: " + Str(resHTTP)
  Debug "HTTP data: " + #CRLF$ + resData
EndProcedure

;1. Init CURL
hcurl = _CURL_INIT()
If hcurl = 0
  Debug "CURL INIT FAILED"
  End
EndIf

;2. Login to the site
LoginSite()

;3. Now we're logged in, do a search or whatever
SearchSite("my+search+terms")

;4. Close CURL when finished
_CURL_CLOSE(hcurl)
Re: Login to HTTPS site, use cookie to get more pages
Posted: Thu Oct 22, 2015 7:59 am
by deseven
There is a little error in my code:
Code:
*header = curl_slist_append(*Headers,header)
; should be
*header = curl_slist_append(0,header)
That doesn't really matter since *Headers is null anyway, so you can leave it as it is. However, you don't need to set any headers at all; I put that in my example just to show all of the possibilities.
Actually, most of the options in my example aren't mandatory; the most basic request can look like this:
Code:
curl_easy_setopt(curl,#CURLOPT_URL,@url)
curl_easy_setopt(curl,#CURLOPT_WRITEFUNCTION,@curlWriteData())
res = curl_easy_perform(curl)
Also, that pbi is not mine; I found it somewhere on the forums and modified it a bit, I don't remember the topic.
Good luck!
Re: Login to HTTPS site, use cookie to get more pages
Posted: Thu Oct 22, 2015 3:40 pm
by Keya
GZIP-compressed ... It's apparently supposed to be easy to get libcurl to do all the work here: we just tell it to use gzip encoding and it's supposed to transparently decode it for us.
But for some reason it's only returning the still-compressed form of the data... there's no decompression, and I can't really see where it would happen either (there's nothing in the WriteFunction, for example).
I also rewrote the WriteFunction so that it's binary-friendly, which is required for compressed/non-string data, and that function seems to be working fine, at least from my testing with uncompressed data.
The following code should be a full working demo for the common httpbin.org/gzip test, simply downloading the small file: http://httpbin.org/gzip
No HTTPS, just normal HTTP, with Accept-Encoding: gzip. But I can't see what I'm doing wrong!

... it's returning the compressed data fine, but not decompressing it, which it's supposed to do automagically.
Code:
CompilerIf #PB_Compiler_Unicode
  MessageRequester("Error","Ascii mode compile only")
  End
CompilerEndIf

EnableExplicit
Global hcurl.i

InitNetwork()
IncludeFile "libcurl.pbi"

Global *curlmem, curlmemsize.i

ProcedureC curlWriteBinData(*ptr, Size, NMemB, *Stream) ;based on http://curl.haxx.se/libcurl/c/getinmemory.html (binary-friendly)
  Protected DataSize.i = Size * NMemB
  Protected *newmem = ReAllocateMemory(*curlmem, curlmemsize + DataSize + 8)
  If *newmem = 0
    Debug "ERROR - ReAllocateMemory failed"
    ProcedureReturn 0 ;keep the old buffer intact and abort the transfer
  EndIf
  *curlmem = *newmem
  CopyMemory(*ptr, *curlmem + curlmemsize, DataSize)
  curlmemsize + DataSize
  ProcedureReturn DataSize
EndProcedure
Procedure.i CURL_GetPage()
  Protected resHTTP.i
  hcurl = curl_easy_init()
  If hcurl
    Protected *versioninfo = curl_version()
    Debug "curl version=" + PeekS(*versioninfo)
    curl_easy_setopt(hcurl,#CURLOPT_POST,0)
    curl_easy_setopt(hcurl,#CURLOPT_URL,@"http://httpbin.org/gzip")
    curl_easy_setopt(hcurl,#CURLOPT_ENCODING,@"gzip")
    curl_easy_setopt(hcurl,#CURLOPT_WRITEFUNCTION,@curlWriteBinData())
    curl_easy_setopt(hcurl,#CURLOPT_HTTP_CONTENT_DECODING,1)
    Protected res = curl_easy_perform(hcurl)
    curl_easy_getinfo(hcurl,#CURLINFO_RESPONSE_CODE,@resHTTP)
    Debug "curl result: " + Str(res)
    Debug "HTTP code: " + Str(resHTTP)
    ShowMemoryViewer(*curlmem, curlmemsize) ;shows the still-gzipped data
  EndIf
  ProcedureReturn hcurl
EndProcedure

hcurl = CURL_GetPage()
If hcurl = 0
  Debug "CURL INIT FAILED"
  End
Else
  curl_easy_cleanup(hcurl)
EndIf
Re: Login to HTTPS site, use cookie to get more pages
Posted: Thu Oct 22, 2015 5:11 pm
by deseven
That's typical PB behavior - take a useful lib, strip it to the core and leave it like that.
It seems there is no gzip support in the internal libcurl.
There are two options for you then:
1. Grab the zlib library and uncompress the data manually.
2. Use the external libcurl. For that I have an older example which I used some time ago. It's kinda messy, but I hope you'll figure it out.
Re: Login to HTTPS site, use cookie to get more pages
Posted: Thu Oct 22, 2015 5:29 pm
by Keya
Here is a breakdown of \Purebasic5.40-x86-Win\PureLibraries\Windows\Libraries\libcurl.lib (~714kb). The compiled PB exe is around 190kb. Yes, it doesn't seem zlib is part of it.

Thank you for pointing that out, because it was not an angle I had even remotely considered!
zlib.lib (~100kb) is another PB include though, so perhaps I can use that? (I'm trying to avoid DLLs because so far all the code is cross-platform, so I'm trying to keep it that way!)
\Purebasic5.40-x86-Win\PureLibraries\Windows\Libraries\libcurl.lib:
Code:
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/base64.obj 3350
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/bundles.obj 2047
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/conncache.obj 4322
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/connect.obj 17710
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/content_encoding.obj 407
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/cookie.obj 19511
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/curl_addrinfo.obj 3331
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/curl_endian.obj 1831
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/curl_fnmatch.obj 5893
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/curl_gethostname.obj 766
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/curl_memrchr.obj 644
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/curl_ntlm.obj 3450
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/curl_ntlm_core.obj 3506
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/curl_ntlm_msgs.obj 403
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/curl_sasl.obj 11375
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/curl_des.obj 689
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/dotdot.obj 2542
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/easy.obj 10035
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/escape.obj 3521
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/fileinfo.obj 1026
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/formdata.obj 17437
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/getenv.obj 1142
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/getinfo.obj 5595
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/hash.obj 5354
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/hmac.obj 1640
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/hostip.obj 10240
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/hostip6.obj 2216
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/hostsyn.obj 1301
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/http.obj 49848
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/http_chunks.obj 4299
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/http_digest.obj 2819
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/http_proxy.obj 12008
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/if2ip.obj 1168
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/llist.obj 2707
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/md5.obj 3073
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/mprintf.obj 12583
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/multi.obj 29252
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/netrc.obj 3171
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/nonblock.obj 695
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/parsedate.obj 9851
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/pipeline.obj 5424
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/pingpong.obj 5990
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/progress.obj 8922
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/rawstr.obj 5496
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/select.obj 3540
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/sendf.obj 8708
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/share.obj 2993
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/slist.obj 2020
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/socks.obj 14226
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/speedcheck.obj 1473
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/splay.obj 2482
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/smtp.obj 25432
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/strdup.obj 1021
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/strequal.obj 961
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/strerror.obj 54013
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/timeval.obj 1639
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/transfer.obj 22165
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/url.obj 67270
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/version.obj 2705
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/warnless.obj 4383
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/wildcard.obj 1167
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/vtls/vtls.obj 15189
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/inet_ntop.obj 2191
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/curl_sasl_gssapi.obj 407
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/curl_sasl_sspi.obj 11920
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/curl_sspi.obj 3247
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/http_negotiate.obj 403
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/http_negotiate_sspi.obj 4239
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/inet_pton.obj 2208
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/strtok.obj 772
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/socks_gssapi.obj 403
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/socks_sspi.obj 11886
c:/purebasic/svn/v5.40/build/x86/libcurl/lib/vtls/schannel.obj 25764
Re: Login to HTTPS site, use cookie to get more pages
Posted: Thu Oct 22, 2015 5:36 pm
by deseven
Keya wrote:im trying to avoid DLLs because so far all the code is cross-platform
But it's usually safe to use the external libcurl. On Windows you can ship the full distro, on Mac it's part of the system, and on Linux I think 99% of desktop distros have it.
In other words, my second example should work on all platforms.
And yes, you can use static zlib to do that.
Re: Login to HTTPS site, use cookie to get more pages
Posted: Thu Oct 22, 2015 7:41 pm
by Keya
Nearly there!! This demo uses static ZLIB (it only adds about 20kb to the Windows exe) and supports both GZIP and DEFLATE methods, both demonstrated (just comment one / uncomment the other).

Thank you to everyone who contributed to the code I've used (I just searched the forums); I still need to add credits.
However, it only supports Windows 32-bit at the moment.
It uses the ZLIB library though, so we should be able to make it support all 3 OS + 64-bit??
I'm going to need some help there, please! I think 99% of the code is complete; we just need to get the right function declares?
Code:
CompilerIf #PB_Compiler_Unicode
  MessageRequester("Error","Ascii mode compile only")
  End
CompilerEndIf

EnableExplicit

CompilerIf #PB_Compiler_OS = #PB_OS_Windows
  ImportC "zlib.lib"
    inflateInit2(s,wb,v,ss) As "_inflateInit2_@16"
    inflate(s,f) As "_inflate@8"
    inflateEnd(s) As "_inflateEnd@4"
  EndImport
CompilerElse
  ImportC "-lz"
    ;Todo
  EndImport
CompilerEndIf
IncludeFile "libcurl.pbi"
Global hcurl.i, *curlmem, curlmemsize.i

CompilerIf #PB_Compiler_Processor = #PB_Processor_x64 And #PB_Compiler_OS <> #PB_OS_Windows
  ; On Linux x64, a long is 8 bytes (unlike Windows x64)
  Structure z_stream
    *next_in.BYTE
    avail_in.l
    pad.l
    total_in.i  ; uLong
    *next_out.BYTE
    avail_out.l
    pad2.l
    total_out.i ; uLong
    *msg.BYTE
    *state
    zalloc.i
    zfree.i
    opaque.i
    data_type.l
    pad3.l
    adler.i     ; uLong
    reserved.i  ; uLong
  EndStructure
CompilerElse
  Structure z_stream
    *next_in.BYTE
    avail_in.l
    total_in.l  ; uLong
    *next_out.BYTE
    avail_out.l
    total_out.l ; uLong
    *msg.BYTE
    *state
    zalloc.i
    zfree.i
    opaque.i
    data_type.l
    adler.l     ; uLong
    reserved.l  ; uLong
    CompilerIf #PB_Compiler_Processor = #PB_Processor_x64
      ; without this, inflateInit2() fails with a version error
      alignment.l
    CompilerEndIf
  EndStructure
CompilerEndIf
Procedure.s ungz(gz) ; returns uncompressed GZIP data as a string; (!) no error-checking
  Protected un, out.s, s.z_stream
  un = AllocateMemory(1000000) ;TODO! adjust to the expected output size
  s\next_in = gz
  s\next_out = un
  s\avail_in = MemorySize(gz)
  s\avail_out = MemorySize(un)
  ;s\zalloc, s\zfree and s\opaque stay #Null so zlib uses its own allocator
  s\data_type = 1
  inflateInit2(@s, 15+32, @"1.2.3", SizeOf(s)) ;15+32=ZLIB(deflate)+GZIP auto-detect, 15+16=just GZIP
  inflate(@s, 4) ;4 = Z_FINISH
  inflateEnd(@s)
  out = PeekS(un)
  FreeMemory(un) ;don't leak the scratch buffer
  ProcedureReturn out
EndProcedure
ProcedureC curlWriteBinData(*ptr, Size, NMemB, *Stream)
  Protected DataSize.i = Size * NMemB
  Protected *newmem = ReAllocateMemory(*curlmem, curlmemsize + DataSize + 8)
  If *newmem = 0
    Debug "ERROR - ReAllocateMemory failed"
    ProcedureReturn 0 ;keep the old buffer intact and abort the transfer
  EndIf
  *curlmem = *newmem
  CopyMemory(*ptr, *curlmem + curlmemsize, DataSize)
  curlmemsize + DataSize
  ProcedureReturn DataSize
EndProcedure
Procedure.i CURL_GetPage()
  Protected resHTTP.i
  hcurl = curl_easy_init()
  If hcurl
    ;GZIP compression
    curl_easy_setopt(hcurl,#CURLOPT_URL,@"http://httpbin.org/gzip")
    curl_easy_setopt(hcurl,#CURLOPT_ENCODING,@"gzip")
    ;DEFLATE compression
    ;curl_easy_setopt(hcurl,#CURLOPT_URL,@"http://httpbin.org/deflate")
    ;curl_easy_setopt(hcurl,#CURLOPT_ENCODING,@"deflate")
    curl_easy_setopt(hcurl,#CURLOPT_WRITEFUNCTION,@curlWriteBinData())
    curl_easy_setopt(hcurl,#CURLOPT_HTTP_CONTENT_DECODING,1)
    Protected res = curl_easy_perform(hcurl)
    curl_easy_getinfo(hcurl,#CURLINFO_RESPONSE_CODE,@resHTTP)
    Debug "curl result: " + Str(res)
    Debug "HTTP code: " + Str(resHTTP)
    Debug "Size=" + Str(curlmemsize)
    Debug "UNPACKED=" + ungz(*curlmem)
  EndIf
  ProcedureReturn hcurl
EndProcedure

InitNetwork()
hcurl = CURL_GetPage()
If hcurl = 0
  Debug "CURL INIT FAILED" : End
Else
  curl_easy_cleanup(hcurl)
EndIf
Re: Login to HTTPS site, use cookie to get more pages
Posted: Fri Oct 23, 2015 5:44 pm
by JHPJHP
Hi Keya,
Nice work, persistence pays off.
When you first posted your question I started writing my latest addition to Services, Stuff, and Shellhook ( Stuff\MoreStuff\Download\Download.pb, Download.ico, Download.vbs ). By the time I finished the script you had taken a different route, but maybe it can still be useful?
Cheers!
Re: Login to HTTPS site, use cookie to get more pages
Posted: Fri Oct 23, 2015 8:48 pm
by Keya
JHPJHPJHP (two is a crowd, three is a party), thank you for your Download.pb demo! BTW, its accompanying .vbs script isn't required and is just an additional demonstration of an alternative method, yes?
After close review it appears to be a perfect (IMO!) example of using raw HTTP headers for GET/POST requests, username & password login, with a cookie for persistence, and using only PB's network functions (not WinAPI), so it's cross-platform too. All excellent!!!

have i got all that right?
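If I've read it right, the raw-header idea boils down to something like this sketch using only PB's network functions (plain HTTP only; the host, path and cookie value here are made up):

```purebasic
; Illustrative sketch of a hand-built request with a cookie header.
InitNetwork()
Con = OpenNetworkConnection("www.site.com", 80) ; plain HTTP, port 80
If Con
  Req.s = "GET /search.php?SearchTerm=test HTTP/1.1" + #CRLF$ +
          "Host: www.site.com" + #CRLF$ +
          "Cookie: session=abc123" + #CRLF$ +     ; cookie from the earlier login
          "Connection: close" + #CRLF$ + #CRLF$
  SendNetworkString(Con, Req)
  ; ... loop on ReceiveNetworkData() to read the reply ...
  CloseNetworkConnection(Con)
EndIf
```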
But does it do HTTPS? That's why I went with libcurl (and the direct libcurl API instead of PB statements), but deseven's two-line demo in this thread's 2nd post shows HTTPS support with native PB statements because Fred added libcurl in 5.40, so I'm guessing your code is actually HTTPS-compatible?
So perhaps I should be using your method instead?
I am very happy with progress though, and hopefully this will make it easier for others in future, so they won't have to worry about all the encryption and compression overhead (which I find no fun) and can just focus on the real task at hand - working with the HTML.
So HTTPS is made easy thanks to LIBCURL.LIB, and GZIP & DEFLATE support is made easy thanks to ZLIB.LIB...
Both libraries support Linux + Mac + Windows, 32 + 64-bit, so we're nearly there!

The HTTPS part, it seems, is fully cross-OS already; we just need to get ZLIB done now. I was hoping to get stuck into that today but I'm having a really bad day

tomorrow, I hope
Re: Login to HTTPS site, use cookie to get more pages
Posted: Fri Oct 23, 2015 9:33 pm
by JHPJHP
Hi Keya,
Sorry to hear you're having a bad day.
You're correct that the VBScript is there for demonstration/comparison purposes only.
Again, you're mostly correct about the script being cross-platform. I included a "cheat method" to avoid window "Not Responding" messages, but this could have been done without APIs.
- I also think the #WM_CLOSE constant in the callback is Windows-only
Code:
Protected Message.MSG
If PeekMessage_(@Message, #Null, 0, 0, #PM_REMOVE)
  TranslateMessage_(@Message)
  DispatchMessage_(@Message)
EndIf
It should be HTTPS compatible, but I won't commit to that until I've fully tested it.
Deciding on which method to use would probably be better determined after you finish your current script; the community will benefit from your efforts.
Feel better.